# Perplexity
To use Open Interpreter with the Perplexity API, set the `--model` flag:
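For example, assuming the `interpreter` CLI is installed, a model can be selected with the `perplexity/` provider prefix (the model name shown here is one of the supported models listed below):

```shell
# Launch Open Interpreter against a Perplexity-hosted model.
interpreter --model perplexity/pplx-70b-online
```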
## Supported Models
We support the following completion models from the Perplexity API:
- pplx-7b-chat
- pplx-70b-chat
- pplx-7b-online
- pplx-70b-online
- codellama-34b-instruct
- llama-2-13b-chat
- llama-2-70b-chat
- mistral-7b-instruct
- openhermes-2-mistral-7b
- openhermes-2.5-mistral-7b
- pplx-7b-chat-alpha
- pplx-70b-chat-alpha
## Required Environment Variables
Set the following environment variables to use these models.
| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `PERPLEXITYAI_API_KEY` | The Perplexity API key from pplx-api | Perplexity API Settings |
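As a minimal sketch, in a POSIX shell the key can be exported before launching the interpreter (the key value below is a placeholder, not a real key):

```shell
# Export the Perplexity API key so Open Interpreter can read it.
export PERPLEXITYAI_API_KEY="pplx-xxxxxxxx"  # placeholder; substitute your own key
interpreter --model perplexity/pplx-70b-online
```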