Profiles are a powerful way to customize your instance of Open Interpreter.
Profiles are Python files that configure Open Interpreter. A wide range of fields, from the model to the context window to the message templates, can be set in a Profile, so you can save multiple variations of Open Interpreter optimized for your specific use cases.
You can access your Profiles by running `interpreter --profiles`. This will open the directory where all of your Profiles are stored.
If you want to make your own Profile, start with the Template Profile.
To apply a Profile to an Open Interpreter session, run `interpreter --profile <name>`.
An example Profile:

```yaml
llm:
  model: "gpt-4o"
  temperature: 0
  # api_key: ...  # Your API key, if the API requires it
  # api_base: ... # The URL where an OpenAI-compatible server is running to handle LLM API requests

# Computer Settings
computer:
  import_computer_api: True # Gives OI a helpful Computer API designed for code interpreting language models

# Custom Instructions
custom_instructions: "" # This will be appended to the system message

# General Configuration
auto_run: False # If True, code will run without asking for confirmation
offline: False # If True, will disable some online features like checking for updates

version: 0.2.5 # Configuration file version (do not modify)
```
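Because Profiles can also be written as Python files, the same settings can be expressed by importing the interpreter object and assigning its attributes. The sketch below is a minimal example, assuming attribute names that mirror the YAML fields above (`llm.model`, `auto_run`, and so on) and a hypothetical filename `my_profile.py`; check the Template Profile for the exact names in your version.

```python
# my_profile.py — a minimal Python Profile sketch.
# Attribute names are assumed to mirror the YAML fields above; verify them
# against the Template Profile for your installed version.
from interpreter import interpreter

# LLM settings
interpreter.llm.model = "gpt-4o"
interpreter.llm.temperature = 0
# interpreter.llm.api_key = "..."   # Your API key, if the API requires it
# interpreter.llm.api_base = "..."  # URL of an OpenAI-compatible server

# Computer settings
interpreter.computer.import_computer_api = True

# Custom instructions appended to the system message
interpreter.custom_instructions = ""

# General configuration
interpreter.auto_run = False  # Ask before running code
interpreter.offline = False   # Keep online features enabled
```

You could then apply it with `interpreter --profile my_profile.py` (the filename here is only an example).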