Hosted Providers
vLLM
To use Open Interpreter with vLLM, you will need to:
- Run `pip install vllm`
- Set the `api_base` flag to the base URL of your vLLM server
- Set the `model` flag to the model that server is hosting (see the example command below)
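For example, assuming a vLLM server that exposes its OpenAI-compatible API at `https://your-hosted-vllm-server/v1` (both the URL and the model name below are placeholders to replace with your own), the two flags can be combined in a single command:

```bash
# Placeholder URL and model name; substitute your own vLLM endpoint and model.
interpreter --api_base https://your-hosted-vllm-server/v1 --model vllm/your-model-name
```

Open Interpreter routes model strings through LiteLLM, so depending on your version the model name may need a provider prefix such as `vllm/`; check which prefix your LiteLLM version expects.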
Supported Models
All models served by vLLM should be supported.
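If you are hosting the server yourself, a minimal sketch of exposing a model through vLLM's OpenAI-compatible server is shown below (the model name is a placeholder, and the entrypoint and defaults may differ across vLLM versions). The `api_base` flag above should then point at `http://<host>:8000/v1`.

```bash
# Start vLLM's OpenAI-compatible server on port 8000 (placeholder model name).
python -m vllm.entrypoints.openai.api_server \
  --model mistralai/Mistral-7B-Instruct-v0.2 \
  --port 8000
```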