Language Models
Settings
The `interpreter.llm` object is responsible for running the language model. See the All Settings page to view the available `interpreter.llm` settings.
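
As a rough sketch of how this is typically used (attribute names follow the commonly documented `interpreter.llm` settings; the All Settings page is the authoritative reference, and the values below are only placeholders):

```python
from interpreter import interpreter

# Choose which model interpreter.llm should run (LiteLLM-style model name).
interpreter.llm.model = "gpt-4o"

# Optional tuning knobs on interpreter.llm; names may change between releases.
interpreter.llm.temperature = 0.3        # sampling temperature
interpreter.llm.context_window = 16000   # tokens the model can attend to
interpreter.llm.max_tokens = 1000        # cap on tokens per completion

# Once configured, start a conversation as usual.
interpreter.chat("What operating system am I on?")
```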