Language Models
Custom Models
In addition to hosted and local language models, Open Interpreter also supports custom models.
As long as your system can accept an input and stream an output (i.e. it can be interacted with via a Python generator), it can be used as the language model in Open Interpreter.
Simply replace the OpenAI-compatible `completions` function in your language model with one of your own:
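Below is a minimal sketch of such a function. It assumes messages arrive as OpenAI-style message dicts and that chunks are streamed back in the OpenAI-style delta format; this example simply echoes the user's last message back, one character at a time.

```python
from interpreter import interpreter

def custom_language_model(messages):
    """
    An OpenAI-compatible completions function.
    This sketch just echoes the user's last message back,
    streaming one character per chunk in the delta format.
    """
    users_content = messages[-1].get("content")  # Content of the most recent message

    # First chunk declares the assistant role, mirroring OpenAI's streaming format
    yield {"delta": {"role": "assistant"}}

    # Then stream the reply, one character per chunk
    for character in users_content:
        yield {"delta": {"content": character}}

# Point Open Interpreter's language model at this function
interpreter.llm.completions = custom_language_model
```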
Then, set the following settings:
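For example (the values below are illustrative; adjust them to match your model):

```python
interpreter.llm.context_window = 2000       # Context window size, in tokens
interpreter.llm.max_tokens = 1000           # Maximum tokens to generate
interpreter.llm.supports_vision = False     # Can this model accept images?
interpreter.llm.supports_functions = False  # Does this model accept / return function calls?
```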
And start using it:
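With the echo sketch above, Open Interpreter will simply stream your own message back:

```python
interpreter.chat("Hi!")  # Streams "Hi!" back, character by character
```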