To use Open Interpreter with vLLM, you will need to:
- Install the vLLM package:

  ```bash
  pip install vllm
  ```

- Set the `api_base` flag:

  ```bash
  interpreter --api_base <https://your-hosted-vllm-server>
  ```
- Set the `model` flag, prefixing the model name with `vllm/`:

  ```bash
  interpreter --model vllm/<vllm-model>
  ```
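The same settings can also be applied from Python instead of the command line. The snippet below is a minimal sketch, assuming a recent Open Interpreter release where LLM settings live on `interpreter.llm`; the server URL and model name are placeholders you would substitute with your own:

```python
from interpreter import interpreter

# Placeholder URL: point this at your hosted vLLM server.
interpreter.llm.api_base = "https://your-hosted-vllm-server"

# Placeholder model name: the "vllm/" prefix routes the request to vLLM.
interpreter.llm.model = "vllm/your-model-name"

# Start an interactive chat using the configured backend.
interpreter.chat()
```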
Supported Models
All models served through vLLM should be supported.
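For example, if your server were hosting `meta-llama/Llama-2-7b-chat-hf` (a hypothetical choice; substitute whichever model your vLLM server actually serves), the full command might look like:

```bash
interpreter --api_base <https://your-hosted-vllm-server> --model vllm/meta-llama/Llama-2-7b-chat-hf
```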