Language Models
Introduction
Open Interpreter works with both hosted and local language models.
Hosted models are faster and more capable, but require payment. Local models are private and free, but are often less capable.
For this reason, we recommend starting with a hosted model, then switching to a local model once you’ve explored Open Interpreter’s capabilities.
Hosted setup
Connect to a hosted language model like GPT-4 (recommended)
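As a quick illustration, here is one way to point Open Interpreter at a hosted model from Python. This is a minimal sketch: the llm.model attribute and the OPENAI_API_KEY environment variable are assumptions based on common OpenAI-style setups, so check the hosted setup page for the exact options your provider needs.

```python
import os

from interpreter import interpreter

# Assumption: the provider key is supplied via an environment variable
# (OPENAI_API_KEY is the conventional name for OpenAI-backed models).
os.environ["OPENAI_API_KEY"] = "sk-..."  # replace with your own key

# Assumed attribute: select a hosted model by name.
interpreter.llm.model = "gpt-4"

# Start an interactive session that uses the hosted model.
interpreter.chat()
```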
Local setup
Set up a local language model like Mistral
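For local models, one common route is pointing Open Interpreter at a locally running inference server that exposes an OpenAI-compatible API (for example, a locally served Mistral model). The attribute names, model name, and URL below are placeholders and assumptions, not a definitive configuration; see the local setup page for the supported options.

```python
from interpreter import interpreter

# Assumed attributes: route requests to a local, OpenAI-compatible
# server instead of a hosted provider.
interpreter.offline = True                              # assumed flag: disable hosted-only features
interpreter.llm.model = "openai/mistral"                # placeholder model name
interpreter.llm.api_base = "http://localhost:1234/v1"   # placeholder local server URL
interpreter.llm.api_key = "not-needed"                  # local servers typically ignore the key

interpreter.chat()
```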
Thank you to the incredible LiteLLM team for their efforts in connecting Open Interpreter to hosted providers.