# Hosted Providers

## OpenAI
To use Open Interpreter with a model from OpenAI, simply run:
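(A minimal sketch, assuming Open Interpreter is installed and `OPENAI_API_KEY` is set in your environment.)

```shell
# Launch Open Interpreter; with OPENAI_API_KEY set, it uses OpenAI by default
interpreter
```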
This will default to `gpt-4-turbo`, which is the most capable publicly available model for code interpretation (Open Interpreter was designed to be used with `gpt-4`).
To run a specific model from OpenAI, set the `model` flag:
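(A sketch using the CLI's `--model` flag; swap in any supported OpenAI model name.)

```shell
# Run Open Interpreter against a specific OpenAI model
interpreter --model gpt-3.5-turbo
```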
### Supported Models
We support any model listed on OpenAI's models page.
### Required Environment Variables
Set the following environment variables to use these models.
| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `OPENAI_API_KEY` | The API key for authenticating to OpenAI's services. | OpenAI Account Page |
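As a quick sketch, in a POSIX shell you can export the key before launching the interpreter (on Windows, use `setx` or your shell's equivalent):

```shell
# Make the OpenAI API key available to Open Interpreter for this session
export OPENAI_API_KEY=<your-api-key>
```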