To use Open Interpreter with a model from OpenRouter, set the --model flag to a value beginning with openrouter/:

interpreter --model openrouter/openai/gpt-3.5-turbo

Supported Models

We support any model on OpenRouter’s models page. For example:

interpreter --model openrouter/openai/gpt-3.5-turbo
interpreter --model openrouter/openai/gpt-3.5-turbo-16k
interpreter --model openrouter/openai/gpt-4
interpreter --model openrouter/openai/gpt-4-32k
interpreter --model openrouter/anthropic/claude-2
interpreter --model openrouter/anthropic/claude-instant-v1
interpreter --model openrouter/google/palm-2-chat-bison
interpreter --model openrouter/google/palm-2-codechat-bison
interpreter --model openrouter/meta-llama/llama-2-13b-chat
interpreter --model openrouter/meta-llama/llama-2-70b-chat

Required Environment Variables

Set the following environment variables to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| OPENROUTER_API_KEY | The API key for authenticating to OpenRouter’s services. | OpenRouter Account Page |
| OR_SITE_URL | The site URL for OpenRouter’s services. | OpenRouter Account Page |
| OR_APP_NAME | The app name for OpenRouter’s services. | OpenRouter Account Page |
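For example, in a POSIX shell you could export these variables before launching Open Interpreter. The values below are placeholders, not real credentials — substitute your own key, site URL, and app name:

```shell
# Placeholder values -- replace with your own OpenRouter credentials.
export OPENROUTER_API_KEY="your-openrouter-api-key"
export OR_SITE_URL="https://example.com"
export OR_APP_NAME="My App"

# Then launch Open Interpreter as shown above:
# interpreter --model openrouter/openai/gpt-3.5-turbo
```

To persist these across sessions, add the export lines to your shell profile (e.g. ~/.bashrc or ~/.zshrc).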