Ollama is an easy way to get local language models running on your computer through a command-line interface. To run Ollama with Open Interpreter:
- Download Ollama for your platform from here.
- Open the installed Ollama application and go through the setup, which will require your password.
- Now you are ready to download a model. You can view all available models here. To download a model, run:
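The download command itself appears to have been dropped from this page. A minimal sketch, assuming the standard Ollama CLI and using `llama3.1` as a placeholder model name (substitute any model from the library):

```shell
# Downloads the model on first use, then starts an interactive session with it.
# Replace llama3.1 with whichever model you chose from the Ollama library.
ollama run llama3.1
```

If you only want to fetch the model without starting a chat, `ollama pull llama3.1` downloads it and returns to the shell.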
- It will likely take a while to download, but once it does, you are ready to use it with Open Interpreter. You can either run `interpreter --local` to set it up interactively in the terminal, or do it manually:
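The manual configuration block is missing here. A sketch of what it looks like from Python, assuming the `interpreter` package is installed, a model (here `llama3.1`, a placeholder) has already been pulled, and the Ollama server is running at its default address:

```python
from interpreter import interpreter

interpreter.offline = True                           # stop Open Interpreter from reaching hosted models
interpreter.llm.model = "ollama/llama3.1"            # "ollama/" prefix routes requests to Ollama
interpreter.llm.api_base = "http://localhost:11434"  # Ollama's default local server address
interpreter.chat()                                   # start an interactive session
```

The equivalent one-liner from the terminal is `interpreter --model ollama/llama3.1`, again substituting your chosen model name.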

