Open the installed Ollama application, and go through the setup, which will require your password.
Now you are ready to download a model. You can browse all available models in the Ollama model library. To download a model, run:
```bash
ollama run <model-name>
```
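For example, to download and start a session with one specific model (the model name below is only an illustration; substitute any model from the library):

```shell
# Download the model if it isn't present, then start an interactive session.
# "llama3.1" is an example name; any model from the Ollama library works here.
ollama run llama3.1
```

You can exit the interactive session with `/bye`; the model stays downloaded for future runs.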
The download will likely take a while, but once it completes, you are ready to use the model with Open Interpreter. You can either run `interpreter --local` to set it up interactively in the terminal, or configure it manually:
```bash
interpreter --model ollama/<model-name>
```
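As a concrete example, assuming the illustrative model name used above:

```shell
# Point Open Interpreter at the locally served model.
# The "ollama/" prefix routes requests to the local Ollama backend;
# "llama3.1" is an example name and should match the model you pulled.
interpreter --model ollama/llama3.1
```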
For any future runs with Ollama, ensure that the Ollama server is running. If you are using the desktop application, check that the Ollama menu bar icon is active.
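If you are unsure whether the server is up, you can also query it directly; by default, Ollama listens on localhost port 11434:

```shell
# A running Ollama server responds on its default port;
# a healthy server typically replies with "Ollama is running".
curl http://localhost:11434

# If it is not running (and you aren't using the desktop app),
# you can start the server manually:
ollama serve
```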
If Ollama is producing strange output, make sure to update to the latest version.
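To check which version you currently have installed (the desktop application updates itself; a standalone CLI install can be updated by re-running the installer):

```shell
# Print the installed Ollama version.
ollama -v
```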