Hosted Providers
Together AI
To use Open Interpreter with Together AI, set the `model` flag:

```shell
interpreter --model together_ai/<together_ai-model>
```
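The same model can be selected from Python. A minimal sketch, assuming the `open-interpreter` package and its `interpreter.llm.model` setting; `<together_ai-model>` is a placeholder for a model name from Together AI's catalog:

```python
# Assumes: pip install open-interpreter
from interpreter import interpreter

# Route completions through Together AI by prefixing the
# model name with "together_ai/". Replace the placeholder
# with a real model from Together AI's catalog.
interpreter.llm.model = "together_ai/<together_ai-model>"

# Start an interactive chat session using that model.
interpreter.chat()
```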
Supported Models
All models on Together AI are supported.
Required Environment Variables

Set the following environment variables (click here to learn how) to use these models.

| Environment Variable | Description | Where to Find |
| --- | --- | --- |
| `TOGETHERAI_API_KEY` | The TogetherAI API key from the Settings page | TogetherAI -> Profile -> Settings -> API Keys |
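Before launching Open Interpreter, export the key in your shell (the value below is a placeholder for your real key):

```shell
# Placeholder value — substitute the key copied from
# TogetherAI -> Profile -> Settings -> API Keys.
export TOGETHERAI_API_KEY="your-together-ai-api-key"
```

Once the variable is set, `interpreter` picks it up automatically on startup.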