Provider Configuration Complete Guide
A Provider is the bridge that connects Opencode to AI models. Configuring Providers properly lets you take full advantage of Opencode's capabilities. This guide takes you from zero to mastering the various Provider configuration methods.
What is a Provider?
A Provider is an AI model "supplier". Opencode supports 75+ Providers, including:
- Official Subscription Services: Claude Pro/Max, ChatGPT Plus/Pro
- API Services: DeepSeek, OpenRouter, Groq
- Local Models: Ollama, LM Studio
- Enterprise Services: AWS Bedrock, Azure OpenAI, Google Vertex AI
Quick Start: Opencode Zen
If you're a beginner, the simplest way is to use Opencode Zen - the official hosted service:
- Run the `/connect` command in Opencode
- Select Opencode as the Provider
- Visit https://opencode.ai/auth to create an API Key
- Paste the API Key into Opencode
- Run `/models` to view the available models
This method requires no complex configuration and is ideal for getting a quick feel for Opencode.
Using Existing Subscription: Claude Pro/Max
If you already subscribe to Claude Pro or Claude Max, you can directly use your subscription account without purchasing additional API access:
Configuration Steps
- Run the command: `opencode auth login`
- Select Anthropic as the Provider
- Choose the login method: Claude Pro/Max
- Complete the OAuth authentication in your browser
- Return to Opencode and run `/models` to confirm the models are loaded
Available Models
- `anthropic/claude-sonnet-4-5` - Latest Sonnet model
- `anthropic/claude-opus-4-5` - Strongest reasoning capability
- `anthropic/claude-haiku-4-5` - Fast responses
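Once the models are loaded, you can pin one as your default in `opencode.json` (a minimal sketch; this uses the same top-level `model` key that appears in the multi-provider example later in this guide):

```json
{
  "model": "anthropic/claude-sonnet-4-5"
}
```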
Using ChatGPT Plus/Pro Subscription
ChatGPT Plus or Pro users can also directly use their subscription:
Prerequisites
You need to install Opencode's OpenAI authentication plugin. Edit the config file `~/.config/opencode/opencode.json`:
```json
{
  "plugin": [
    "opencode-openai-codex-auth@latest"
  ]
}
```
Configuration Steps
- Restart Opencode to activate the plugin
- Run: `opencode auth login`
- Select OpenAI as the Provider
- Choose ChatGPT Plus/Pro (Codex Subscription)
- Log in to your OpenAI account in the browser
Available Models
- `openai/gpt-5.2` - Latest flagship model
- `openai/gpt-5.2-codex` - Optimized for programming
- `openai/gpt-5.1-codex-max` - Maximum context window
Local Models: Ollama
Want something completely free that keeps your data private? Ollama is the best choice.
Install Ollama
Visit ollama.ai to download and install, then start the service:
```shell
ollama serve
```
Download Models
```shell
# Download Llama 3
ollama pull llama3

# Download DeepSeek Coder (recommended for programming)
ollama pull deepseek-coder

# Download Qwen Coder
ollama pull qwen2.5-coder
```
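Before wiring Ollama into Opencode, you can sanity-check that the local server is up. Ollama exposes an OpenAI-compatible API on port 11434, which is the same endpoint the config below points at (the `|| echo` fallback is just for a friendlier message when the service isn't running):

```shell
# List the models Ollama is serving via its OpenAI-compatible endpoint.
# Falls back to a hint if the service is not reachable on the default port.
curl -s --max-time 2 http://localhost:11434/v1/models || echo "Ollama is not running - start it with: ollama serve"
```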
Configure Opencode
Edit ~/.config/opencode/opencode.json:
```json
{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (Local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "deepseek-coder": {
          "name": "DeepSeek Coder",
          "contextWindow": 16384
        },
        "llama3": {
          "name": "Llama 3",
          "contextWindow": 8192
        }
      }
    }
  }
}
```
Restart Opencode, run /models to see local models.
Cost-Effective Choice: DeepSeek API
DeepSeek provides a highly cost-effective API at prices far below OpenAI's.
Get API Key
- Visit platform.deepseek.com
- Register and top up your balance (minimum $1.50)
- Create API Key
Configuration Method
Run /connect in Opencode:
- Select DeepSeek as Provider
- Enter your API Key
- Complete configuration
Recommended Models
- `deepseek-chat` - General conversation model
- `deepseek-coder` - Optimized for programming
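If you'd rather configure DeepSeek in the config file instead of via `/connect`, a sketch that mirrors the Ollama example above may work (the baseURL is DeepSeek's published OpenAI-compatible endpoint; the `{env:...}` apiKey syntax and the model entries are assumptions to adapt to your setup):

```json
{
  "provider": {
    "deepseek": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "DeepSeek",
      "options": {
        "baseURL": "https://api.deepseek.com/v1",
        "apiKey": "{env:DEEPSEEK_API_KEY}"
      },
      "models": {
        "deepseek-chat": { "name": "DeepSeek Chat" },
        "deepseek-coder": { "name": "DeepSeek Coder" }
      }
    }
  }
}
```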
Multi-Provider Strategy
For production use, it is recommended to configure multiple Providers and divide the work among them:
| Role | Provider | Model | Purpose |
|---|---|---|---|
| Main | Claude Pro | Opus 4.5 | Complex tasks, architecture design |
| Quick | DeepSeek | deepseek-coder | Simple modifications, code completion |
| Local | Ollama | llama3 | Privacy-sensitive tasks |
Configure in opencode.json:
```json
{
  "model": "anthropic/claude-opus-4-5",
  "small_model": "deepseek/deepseek-coder",
  "enabled_providers": ["anthropic", "deepseek", "ollama"]
}
```
Opencode will automatically select appropriate models based on task complexity.
Common Questions
Q: How to switch Provider?
Run /models command and select the model you want to use.
Q: Can I use multiple Providers simultaneously?
Yes. Opencode supports configuring multiple Providers and can specify different models for different Agents.
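For example, to give a specific Agent its own model, a hypothetical sketch in `opencode.json` (the `agent` key and the agent name `review` are assumptions here; check the schema of your Opencode version):

```json
{
  "model": "anthropic/claude-opus-4-5",
  "agent": {
    "review": {
      "model": "deepseek/deepseek-coder"
    }
  }
}
```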
Q: How is local model performance?
For simple tasks, local models like DeepSeek Coder perform well. For complex architecture and design work, however, a larger model like Claude Opus is recommended.
Q: How are API fees calculated?
Different Providers have different billing methods:
- Claude Pro/ChatGPT Plus: Monthly subscription, no additional fees
- DeepSeek API: Billed by token, approximately $0.0001/1K tokens
- Ollama: Completely free
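To put per-token billing in perspective, here is a quick back-of-the-envelope calculation at the rate quoted above ($0.0001 per 1K tokens; the rate is illustrative, so check DeepSeek's current pricing):

```shell
# Estimate the cost of 1M tokens at $0.0001 per 1K tokens
tokens=1000000
awk -v t="$tokens" 'BEGIN { printf "$%.2f\n", t / 1000 * 0.0001 }'
# prints $0.10
```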
Next Steps
- Configuration Priority and File Locations
- Agent Configuration and Division
- Permissions and Security Settings
This article is compiled by the OpenCodex community. If you have questions, feel free to submit an Issue on GitHub.