To use Open Interpreter with a model from OpenAI, simply run:

interpreter

This will default to gpt-4-turbo, the most capable publicly available model for code interpretation (Open Interpreter was designed to be used with GPT-4).

To run a specific model from OpenAI, set the --model flag:

interpreter --model gpt-3.5-turbo

Supported Models

We support any model on OpenAI’s models page:

interpreter --model gpt-4o

Required Environment Variables

Set the following environment variables to use these models.

Environment Variable    Description                                            Where to Find
OPENAI_API_KEY          The API key for authenticating to OpenAI's services.   OpenAI Account Page
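For example, on macOS or Linux the key can be exported in the shell before launching Open Interpreter (the key value below is a placeholder):

```shell
# Export the OpenAI API key for the current shell session
# (replace the placeholder with your real key).
export OPENAI_API_KEY="sk-..."

# On Windows PowerShell, the equivalent would be:
#   $Env:OPENAI_API_KEY = "sk-..."
```

To persist the variable across sessions, add the export line to your shell profile (e.g. ~/.bashrc or ~/.zshrc).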