To use Open Interpreter with DeepInfra, set the model flag:

interpreter --model deepinfra/<deepinfra-model>

Supported Models

We support the following completion models from DeepInfra:

  • Llama-2 70b chat hf
  • Llama-2 7b chat hf
  • Llama-2 13b chat hf
  • CodeLlama 34b instruct awq
  • Mistral 7b instruct v0.1
  • Airoboros l2 70b gpt4 1.4.1

interpreter --model deepinfra/meta-llama/Llama-2-70b-chat-hf
interpreter --model deepinfra/meta-llama/Llama-2-7b-chat-hf
interpreter --model deepinfra/meta-llama/Llama-2-13b-chat-hf
interpreter --model deepinfra/codellama/CodeLlama-34b-Instruct-hf
interpreter --model deepinfra/mistral/mistral-7b-instruct-v0.1
interpreter --model deepinfra/jondurbin/airoboros-l2-70b-gpt4-1.4.1

Required Environment Variables

Set the following environment variable to use these models.

  Environment Variable   Description         Where to Find
  DEEPINFRA_API_KEY      DeepInfra API key   DeepInfra Dashboard -> API Keys
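Putting the pieces together, a minimal shell setup looks like the sketch below. The key value shown is a placeholder, not a real key; substitute the API key from your DeepInfra dashboard.

```shell
# Export the API key so Open Interpreter can authenticate with DeepInfra.
# "sk-example-key" is a placeholder value for illustration only.
export DEEPINFRA_API_KEY="sk-example-key"

# Confirm the variable is set and visible to child processes.
echo "$DEEPINFRA_API_KEY"

# Then launch Open Interpreter with a DeepInfra model, e.g.:
# interpreter --model deepinfra/meta-llama/Llama-2-70b-chat-hf
```

Exporting (rather than just assigning) the variable matters: `interpreter` runs as a child process and only inherits exported variables.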