To use Open Interpreter with a model from OpenRouter, set the model flag to begin with openrouter/:
interpreter --model openrouter/openai/gpt-3.5-turbo

Supported Models

We support any model listed on OpenRouter’s models page. For example:
interpreter --model openrouter/openai/gpt-3.5-turbo
interpreter --model openrouter/openai/gpt-3.5-turbo-16k
interpreter --model openrouter/openai/gpt-4
interpreter --model openrouter/openai/gpt-4-32k
interpreter --model openrouter/anthropic/claude-2
interpreter --model openrouter/anthropic/claude-instant-v1
interpreter --model openrouter/google/palm-2-chat-bison
interpreter --model openrouter/google/palm-2-codechat-bison
interpreter --model openrouter/meta-llama/llama-2-13b-chat
interpreter --model openrouter/meta-llama/llama-2-70b-chat

Required Environment Variables

Set the following LiteLLM environment variables to use these models.
| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `OPENROUTER_API_KEY` | The API key for authenticating to OpenRouter’s services. | OpenRouter Account Page |
| `OR_SITE_URL` | An optional app URL for tracking usage, such as `https://github.com/openinterpreter/open-interpreter/`. | Your choice |
| `OR_APP_NAME` | An optional app name for tracking usage, such as `"Open Interpreter"`. | Your choice |
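For example, you can export these variables in your shell before launching the interpreter. The API key below is a hypothetical placeholder; substitute the key from your own OpenRouter account:

```shell
# Required: your OpenRouter API key (placeholder value shown here)
export OPENROUTER_API_KEY="sk-or-v1-your-key-here"

# Optional: identify your app to OpenRouter for usage tracking
export OR_SITE_URL="https://github.com/openinterpreter/open-interpreter/"
export OR_APP_NAME="Open Interpreter"
```

With the variables set, any of the `interpreter --model openrouter/...` commands above will authenticate through your OpenRouter account.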