To use Open Interpreter with the Perplexity API, set the --model flag:

interpreter --model perplexity/<perplexity-model>

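If you're driving Open Interpreter from Python rather than the command line, the same model identifier can be set on the interpreter object. A minimal sketch, assuming a recent Open Interpreter release whose Python API exposes interpreter.llm.model (older releases used interpreter.model instead):

from interpreter import interpreter

# Same "perplexity/<model>" identifier that the --model flag accepts.
interpreter.llm.model = "perplexity/pplx-7b-online"

# Requires PERPLEXITYAI_API_KEY in the environment (see
# Required Environment Variables below).
interpreter.chat("Hello!")
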
Supported Models

We support the following completion models from the Perplexity API:

  • pplx-7b-chat
  • pplx-70b-chat
  • pplx-7b-online
  • pplx-70b-online
  • codellama-34b-instruct
  • llama-2-13b-chat
  • llama-2-70b-chat
  • mistral-7b-instruct
  • openhermes-2-mistral-7b
  • openhermes-2.5-mistral-7b
  • pplx-7b-chat-alpha
  • pplx-70b-chat-alpha

To run one of these models, pass its name to the --model flag:

interpreter --model perplexity/pplx-7b-chat
interpreter --model perplexity/pplx-70b-chat
interpreter --model perplexity/pplx-7b-online
interpreter --model perplexity/pplx-70b-online
interpreter --model perplexity/codellama-34b-instruct
interpreter --model perplexity/llama-2-13b-chat
interpreter --model perplexity/llama-2-70b-chat
interpreter --model perplexity/mistral-7b-instruct
interpreter --model perplexity/openhermes-2-mistral-7b
interpreter --model perplexity/openhermes-2.5-mistral-7b
interpreter --model perplexity/pplx-7b-chat-alpha
interpreter --model perplexity/pplx-70b-chat-alpha

Required Environment Variables

Set the following environment variable to use these models.

Environment Variable      Description                             Where to Find
PERPLEXITYAI_API_KEY      The Perplexity API key from pplx-api    Perplexity API Settings
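
For a quick test you can export the key in your shell (export PERPLEXITYAI_API_KEY=<your key>) before launching interpreter. If you're using the Python API instead, the key can be set in-process before the first request. A minimal sketch; the value shown is a placeholder, not a real key:

import os

# Placeholder value; substitute the key generated in your Perplexity
# API settings. Must be set before Open Interpreter sends a request.
os.environ["PERPLEXITYAI_API_KEY"] = "pplx-xxxxxxxxxxxxxxxx"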