Model Providers

The Zoo Code VS Code extension and Zoo Code Cloud Agents are highly capable, sophisticated coding agents. While they carry a lot of functionality on their own, they depend on LLM providers to offer the inference needed to complete tasks.

Other tools are intrinsically bound by design to a specific provider (like Claude Code → Anthropic models or Codex → OpenAI models), forcing you to stick with that provider regardless of how the landscape changes (and it changes fast). Zoo, on the other hand, is model-agnostic: you can choose the model that best fits your budget, skill profile, codebase, and more.

We support connecting to a wide range of model providers, giving you flexibility in how you access AI models. Some providers work with the VS Code extension, while others are available through Zoo Code Cloud Agents.

Learn how to set up your provider in the Zoo Code VS Code extension here.

Provider Comparison

Here's every provider we support and where you can use it. Click a provider for detailed setup instructions.

We regularly run evals for all supported models to see how they do against our standard test suite. See the latest results here.

| Provider | VS Code Extension | Cloud Agents |
|----------|-------------------|--------------|
| Anthropic | | |
| ChatGPT Plus/Pro | | |
| AWS Bedrock | | |
| DeepSeek | | |
| Fireworks AI | | |
| Google Gemini | | |
| LM Studio | | |
| LiteLLM | | |
| Mistral AI | | |
| Moonshot | | |
| Ollama | | |
| OpenAI | | |
| OpenAI Compatible | | |
| OpenRouter | | |
| Qwen Code CLI | | |
| Requesty | | |
| SambaNova | | |
| Vercel AI Gateway | | |
| GCP Vertex AI | | |
| VS Code Language Model API | | |
| xAI (Grok) | | |
| Z AI | | |

Overwhelmed by choice?

Yeah, it's a lot.

  • Want access to many models? Try OpenRouter for a single API to 100+ models
  • Want direct Claude access? Try Anthropic for first-party Claude models
  • Want to optimize for specific models? Use the first-party provider for each of them (Anthropic, OpenAI, etc.)
  • Looking for local/offline models? Check out Ollama or LM Studio
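Whichever you pick, most of the gateways and local runtimes above expose an OpenAI-compatible chat completions API, so switching providers mostly means changing a base URL and a model ID. As a minimal sketch (the URLs and model names below are illustrative examples, not Zoo Code defaults):

```python
# Sketch of the request shape an OpenAI-compatible provider expects.
# The base URLs and model IDs here are illustrative assumptions.

def build_chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Assemble a minimal OpenAI-style chat completion request."""
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# The same shape works for a hosted gateway like OpenRouter or for
# Ollama's local OpenAI-compatible endpoint; only the base URL and
# model ID change.
openrouter = build_chat_request(
    "https://openrouter.ai/api/v1", "anthropic/claude-sonnet-4", "Hello"
)
local = build_chat_request(
    "http://localhost:11434/v1", "qwen2.5-coder", "Hello"
)
```

This is why a model-agnostic tool can swap providers cheaply: the payload stays the same, and only the endpoint and model identifier differ.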