perplexity-gradio

Commits

List of commits on branch master:

e266d644d43c014d90a7599a41a5580ff625deb7  update pyproject (AAK391, 2 months ago, unverified)
4f53cab80691fa8579a30db0ec9fe8abf3d4e9ad  fix image (AAK391, 2 months ago, unverified)
3e912b123ba9c12379f56d53329bf632dac4adce  updates (AAK391, 2 months ago, unverified)
f23c38a17a1e315906be9ff1f0ea0d92d98c6e77  update working version (AAK391, 2 months ago, unverified)
20ccb5b149f9b5b828e8cea8e225d588c4f507cb  update readme (AAK391, 3 months ago, unverified)
6f9c8c2fc6292a8fb85c8d3aecb55788c5e79d97  update version (AAK391, 3 months ago, unverified)

README


perplexity-gradio is a Python package that makes it easy for developers to create machine learning apps powered by Perplexity's API.

Installation

You can install perplexity-gradio directly using pip:

pip install perplexity-gradio

That's it!

Basic Usage

First, save your Perplexity API key in the following environment variable:

export PERPLEXITY_API_KEY=<your token>

Then in a Python file, write:

import gradio as gr
import perplexity_gradio

gr.load(
    name='llama-3.1-sonar-large-128k-online',
    src=perplexity_gradio.registry,
).launch()

Run the Python file, and you should see a Gradio Interface connected to the model on Perplexity!
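
For example, if you saved the snippet above as app.py (the filename here is just an example), launch it with:

python app.py

and open the local URL that Gradio prints to your terminal (by default http://127.0.0.1:7860).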

(Screenshot: ChatInterface)

Customization

Once you can create a Gradio UI from a Perplexity endpoint, you can customize it by setting your own input and output components, or any other arguments to gr.Interface. For example, the screenshot below was generated with:

import gradio as gr
import perplexity_gradio

gr.load(
    name='llama-3.1-sonar-large-128k-online',
    src=perplexity_gradio.registry,
    title='Perplexity-Gradio Integration',
    description="Chat with llama-3.1-sonar-large-128k-online model.",
    examples=["Explain quantum gravity to a 5-year old.", "How many R are there in the word Strawberry?"]
).launch()

(Screenshot: ChatInterface with customizations)

Composition

Or use your loaded Interface within larger Gradio Web UIs, e.g.

import gradio as gr
import perplexity_gradio

with gr.Blocks() as demo:
    with gr.Tab("llama-3.1-sonar-large-128k-online"):
        gr.load('llama-3.1-sonar-large-128k-online', src=perplexity_gradio.registry)
    with gr.Tab("llama-3.1-sonar-small-128k-online"):
        gr.load('llama-3.1-sonar-small-128k-online', src=perplexity_gradio.registry)

demo.launch()

Under the Hood

The perplexity-gradio Python library has two dependencies: openai and gradio. It defines a "registry" function perplexity_gradio.registry, which takes in a model name and returns a Gradio app.
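
As a rough sketch only (not the library's actual implementation), a registry-style function of this shape could look something like the following, assuming Perplexity's OpenAI-compatible endpoint at https://api.perplexity.ai and Gradio's default (user, assistant) chat history format:

import os
import gradio as gr
from openai import OpenAI

def registry(name: str, **kwargs) -> gr.ChatInterface:
    # Point the OpenAI client at Perplexity's OpenAI-compatible API.
    client = OpenAI(
        api_key=os.environ["PERPLEXITY_API_KEY"],
        base_url="https://api.perplexity.ai",
    )

    def respond(message, history):
        # history is a list of (user, assistant) pairs in Gradio's default format;
        # rebuild it in the chat-completions message format.
        messages = []
        for user_msg, assistant_msg in history:
            messages.append({"role": "user", "content": user_msg})
            messages.append({"role": "assistant", "content": assistant_msg})
        messages.append({"role": "user", "content": message})
        completion = client.chat.completions.create(model=name, messages=messages)
        return completion.choices[0].message.content

    # Extra keyword arguments (title, description, examples, ...) are forwarded to the UI.
    return gr.ChatInterface(respond, **kwargs)

The actual implementation may differ; this only illustrates the overall shape: model name in, Gradio app out.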

Supported Models

For a comprehensive list of available models and their specifications, please refer to the Perplexity Model Cards documentation.

Note: The Online LLMs' search subsystem does not attend to the system prompt. The system prompt can be used to provide instructions related to style, tone, and language of the response.
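
For illustration (this calls the API through the openai client directly rather than through perplexity-gradio), a system message can be used to steer the style and tone of answers, assuming Perplexity's OpenAI-compatible endpoint at https://api.perplexity.ai:

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["PERPLEXITY_API_KEY"],
    base_url="https://api.perplexity.ai",
)

# The system prompt shapes style, tone, and language of the answer;
# the online search step does not attend to it.
response = client.chat.completions.create(
    model="llama-3.1-sonar-large-128k-online",
    messages=[
        {"role": "system", "content": "Answer concisely, in plain English, without jargon."},
        {"role": "user", "content": "Explain quantum gravity to a 5-year old."},
    ],
)
print(response.choices[0].message.content)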

Note: if you get a 401 authentication error, the OpenAI API client was unable to read the API key from the environment variable. If that happens, set the key directly in your Python session before calling gr.load, like this:

import os

# Set the key for the current Python session (replace ... with your actual key).
os.environ["PERPLEXITY_API_KEY"] = ...