To use Open Interpreter with the Cloudflare Workers AI API, set the model flag:

interpreter --model cloudflare/<cloudflare-model>

Supported Models

We support the following completion models from Cloudflare Workers AI:

  • Llama-2 7b chat fp16
  • Llama-2 7b chat int8
  • Mistral 7b instruct v0.1
  • CodeLlama 7b instruct awq

interpreter --model cloudflare/@cf/meta/llama-2-7b-chat-fp16
interpreter --model cloudflare/@cf/meta/llama-2-7b-chat-int8
interpreter --model cloudflare/@cf/mistral/mistral-7b-instruct-v0.1
interpreter --model cloudflare/@hf/thebloke/codellama-7b-instruct-awq
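
Putting it together, a typical session first exports the credentials described in the next section and then launches one of the commands above. A minimal sketch (the token and account ID values are placeholders, not real credentials):

```shell
# Placeholder credentials: substitute your own values.
export CLOUDFLARE_API_KEY="your-cloudflare-api-token"
export CLOUDFLARE_ACCOUNT_ID="your-cloudflare-account-id"

# Launch Open Interpreter against a Cloudflare-hosted model.
interpreter --model cloudflare/@cf/mistral/mistral-7b-instruct-v0.1
```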

Required Environment Variables

Set the following environment variables to use these models.

Environment Variable    Description                   Where to Find
CLOUDFLARE_API_KEY      Cloudflare API key            Cloudflare Profile Page -> API Tokens
CLOUDFLARE_ACCOUNT_ID   Your Cloudflare account ID    Cloudflare Dashboard -> Overview page -> API section
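
As a quick sanity check before launching, you can confirm both variables are set from Python. A small sketch; the helper name below is illustrative, not part of Open Interpreter:

```python
import os

# The two variables the Cloudflare provider requires, per the table above.
REQUIRED = ("CLOUDFLARE_API_KEY", "CLOUDFLARE_ACCOUNT_ID")

def missing_cloudflare_vars(env=os.environ):
    """Return the names of required Cloudflare variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

if __name__ == "__main__":
    missing = missing_cloudflare_vars()
    if missing:
        print("Missing:", ", ".join(missing))
    else:
        print("Cloudflare environment variables are set.")
```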