Local Providers

Custom Endpoint

Set api_base to the URL of any OpenAI-compatible server:

interpreter --api_base <custom_endpoint>
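The same setting can be applied from Python on the interpreter object before starting a chat. The snippet below is a minimal sketch: the endpoint URL, model name, and api_key value are placeholders for your own OpenAI-compatible server, not defaults shipped with Open Interpreter.

from interpreter import interpreter

# Point Open Interpreter at an OpenAI-compatible server.
# Replace the URL and model name with your own endpoint's values.
interpreter.llm.api_base = "http://localhost:1234/v1"
interpreter.llm.api_key = "dummy"  # many local servers accept any key
interpreter.llm.model = "openai/local-model"  # "openai/" prefix routes requests to the custom endpoint

interpreter.chat("Hello!")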