To use Open Interpreter with vLLM, you will need to:

  1. pip install vllm
  2. Set the api_base flag:
interpreter --api_base <https://your-hosted-vllm-server>
  3. Set the model flag:
interpreter --model vllm/<your-vllm-model>
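
The same configuration can also be set from Python instead of the command line. The sketch below is a minimal example, assuming a recent Open Interpreter release where LLM settings live on interpreter.llm; the server URL is a placeholder and facebook/opt-125m is used only as an illustrative model name, not a requirement.

from interpreter import interpreter

# Point Open Interpreter at your hosted vLLM server (placeholder URL)
interpreter.llm.api_base = "https://your-hosted-vllm-server"

# Prefix the model name with "vllm/" so the request is routed to vLLM;
# facebook/opt-125m is only an example model served by vLLM
interpreter.llm.model = "vllm/facebook/opt-125m"

# Start an interactive chat session using the configured model
interpreter.chat()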

Supported Models

All models served by vLLM should be supported.
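
For example, if your server happens to be serving facebook/opt-125m (used here purely as an illustration), the two flags combine into a single command like:

interpreter --api_base https://your-hosted-vllm-server --model vllm/facebook/opt-125m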