Profiles
Profiles are a powerful way to customize your instance of Open Interpreter.
Profiles are Python files that configure Open Interpreter. A wide range of fields, from the model and context window to the message templates, can be configured in a Profile. This allows you to save multiple variations of Open Interpreter to optimize for your specific use-cases.
You can access your Profiles by running `interpreter --profiles`. This will open the directory where all of your Profiles are stored.
If you want to make your own profile, start with the Template Profile.
To apply a Profile to an Open Interpreter session, run `interpreter --profile <name>`.
Example Python Profile
```python
from interpreter import interpreter

interpreter.os = True
interpreter.llm.supports_vision = True
interpreter.llm.model = "gpt-4o"
interpreter.llm.supports_functions = True
interpreter.llm.context_window = 110000
interpreter.llm.max_tokens = 4096
interpreter.auto_run = True
interpreter.loop = True
```
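Because a Profile is ordinary Python, it can also compute or validate values rather than only listing constants. A minimal, hedged sketch of that idea, using `SimpleNamespace` as a stand-in for the real `interpreter` object so the snippet runs standalone (the attribute names follow the example above):

```python
from types import SimpleNamespace

# Stand-in for `from interpreter import interpreter`, so this sketch
# runs without the package installed.
interpreter = SimpleNamespace(llm=SimpleNamespace())

# Same attribute assignments as a real Profile would make.
interpreter.llm.model = "gpt-4o"
interpreter.llm.context_window = 110000
interpreter.llm.max_tokens = 4096

# A Profile can enforce invariants at load time, e.g. keeping the
# response budget well under the context window.
assert interpreter.llm.max_tokens < interpreter.llm.context_window
```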
Example YAML Profile
Make sure the `version` field in a YAML Profile is set to 0.2.5.
```yaml
llm:
  model: "gpt-4o"
  temperature: 0
  # api_key: ...  # Your API key, if the API requires it
  # api_base: ...  # The URL of an OpenAI-compatible server that handles LLM API requests

# Computer Settings
computer:
  import_computer_api: True # Gives OI a helpful Computer API designed for code interpreting language models

# Custom Instructions
custom_instructions: "" # This will be appended to the system message

# General Configuration
auto_run: False # If True, code will run without asking for confirmation
offline: False # If True, disables some online features like checking for updates

version: 0.2.5 # Configuration file version (do not modify)
```
There are many more settings that can be configured. See them all here