OpenCode Tutorials
Provider Configuration Complete Guide

A Provider is the bridge between Opencode and AI models, and configuring one properly lets you take full advantage of Opencode's capabilities. This guide takes you from zero to mastering the various Provider configuration methods.

What is a Provider?

A Provider is an AI model "supplier". Opencode supports 75+ Providers, including:

  • Official Subscription Services: Claude Pro/Max, ChatGPT Plus/Pro
  • API Services: DeepSeek, OpenRouter, Groq
  • Local Models: Ollama, LM Studio
  • Enterprise Services: AWS Bedrock, Azure OpenAI, Google Vertex AI

Quick Start: Opencode Zen

If you're a beginner, the simplest way is to use Opencode Zen - the official hosted service:

  1. Run /connect command in Opencode
  2. Select Opencode as Provider
  3. Visit https://opencode.ai/auth to create an API Key
  4. Paste the API Key into Opencode
  5. Run /models to view available models

This method requires no complex configuration and is a good way to try Opencode quickly.

Using Existing Subscription: Claude Pro/Max

If you already subscribe to Claude Pro or Claude Max, you can directly use your subscription account without purchasing additional API access:

Configuration Steps

  1. Run:

opencode auth login

  2. Select Anthropic as Provider

  3. Choose the login method: Claude Pro/Max

  4. Complete OAuth authentication in the browser

  5. Return to Opencode and run /models to confirm the models are loaded

Available Models

  • anthropic/claude-sonnet-4-5 - Latest Sonnet model
  • anthropic/claude-opus-4-5 - Strongest reasoning capability
  • anthropic/claude-haiku-4-5 - Fast response

Using ChatGPT Plus/Pro Subscription

ChatGPT Plus or Pro users can also directly use their subscription:

Prerequisites

You need to install Opencode's OpenAI authentication plugin first. Edit the config file ~/.config/opencode/opencode.json:

{
  "plugin": [
    "opencode-openai-codex-auth@latest"
  ]
}

Configuration Steps

  1. Restart Opencode to activate the plugin

  2. Run:

opencode auth login

  3. Select OpenAI as Provider

  4. Choose ChatGPT Plus/Pro (Codex Subscription)

  5. Log in to your OpenAI account in the browser

Available Models

  • openai/gpt-5.2 - Latest flagship model
  • openai/gpt-5.2-codex - Optimized for programming
  • openai/gpt-5.1-codex-max - Maximum context window

Local Models: Ollama

Want something completely free that also protects your privacy? Ollama is the best choice.

Install Ollama

Visit ollama.ai to download and install, then start the service:

ollama serve

Download Models

# Download Llama 3
ollama pull llama3

# Download DeepSeek Coder (recommended for programming)
ollama pull deepseek-coder

# Download Qwen Coder
ollama pull qwen2.5-coder
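After pulling models, you can confirm what is installed by querying Ollama's local HTTP API (the /api/tags endpoint lists downloaded models). A minimal Python sketch, assuming Ollama is running on its default port 11434 (the helper names here are our own, not part of any Opencode or Ollama SDK):

```python
import json
import urllib.request

def model_names(tags_response):
    """Extract model names from an Ollama /api/tags response dict.

    The response has the shape {"models": [{"name": "llama3:latest", ...}, ...]}.
    """
    return [m["name"] for m in tags_response.get("models", [])]

def list_ollama_models(base_url="http://localhost:11434"):
    """Ask a local Ollama instance which models are installed."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return model_names(json.load(resp))

# Usage (requires a running Ollama instance):
#   print(list_ollama_models())
```

If a model you pulled is missing from this list, Opencode's /models command won't show it either, so this is a quick way to debug the configuration below.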

Configure Opencode

Edit ~/.config/opencode/opencode.json:

{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (Local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "deepseek-coder": {
          "name": "DeepSeek Coder",
          "contextWindow": 16384
        },
        "llama3": {
          "name": "Llama 3",
          "contextWindow": 8192
        }
      }
    }
  }
}

Restart Opencode, run /models to see local models.

Cost-Effective Choice: DeepSeek API

DeepSeek provides a highly cost-effective API service at prices far lower than OpenAI's.

Get API Key

  1. Visit platform.deepseek.com
  2. Register and add credit (minimum $1.50)
  3. Create API Key

Configuration Method

Run /connect in Opencode:

  1. Select DeepSeek as Provider
  2. Enter your API Key
  3. Complete configuration
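DeepSeek's API is OpenAI-compatible, so you can sanity-check your key outside Opencode with a plain chat-completions request. A minimal sketch using only the Python standard library (the base URL and model name follow DeepSeek's public docs; verify them there before relying on this):

```python
import json
import urllib.request

DEEPSEEK_BASE_URL = "https://api.deepseek.com"

def build_chat_request(api_key, model="deepseek-chat", prompt="Hello"):
    """Build an OpenAI-style chat-completions request for DeepSeek."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{DEEPSEEK_BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

def ask(api_key, prompt):
    """Send one prompt and return the assistant's reply text (needs network)."""
    with urllib.request.urlopen(build_chat_request(api_key, prompt=prompt)) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

If this round-trips successfully with your key, any failure inside Opencode is a configuration issue rather than a key or billing issue.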

Recommended Models

  • deepseek-chat - General conversation model
  • deepseek-coder - Optimized for programming

Multi-Provider Strategy

For production use, it's recommended to configure multiple Providers and divide the work among them:

| Role  | Provider   | Model          | Purpose                               |
| ----- | ---------- | -------------- | ------------------------------------- |
| Main  | Claude Pro | Opus 4.5       | Complex tasks, architecture design    |
| Quick | DeepSeek   | deepseek-coder | Simple modifications, code completion |
| Local | Ollama     | llama3         | Privacy-sensitive tasks               |

Configure in opencode.json:

{
  "model": "anthropic/claude-opus-4-5",
  "small_model": "deepseek/deepseek-coder",
  "enabled_providers": ["anthropic", "deepseek", "ollama"]
}

Opencode will automatically select appropriate models based on task complexity.
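The idea behind the main-model / small-model split can be illustrated with a tiny routing function. This is not Opencode's actual selection logic, just a sketch of the division the config above expresses; the keyword-based complexity heuristic is invented purely for illustration:

```python
CONFIG = {
    "model": "anthropic/claude-opus-4-5",      # main model for complex work
    "small_model": "deepseek/deepseek-coder",  # cheap model for quick edits
}

def pick_model(task_description, config=CONFIG):
    """Toy heuristic: short tasks mentioning simple edits go to the small model.

    Anything long or unrecognized falls through to the main model, mirroring
    the model/small_model split in opencode.json.
    """
    simple_markers = ("rename", "typo", "format", "comment", "complete")
    is_simple = (len(task_description) < 80
                 and any(m in task_description.lower() for m in simple_markers))
    return config["small_model"] if is_simple else config["model"]
```

The design point is that the expensive model is the default and the cheap model is an opt-in fast path, so misclassified tasks degrade toward higher quality rather than lower.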

Common Questions

Q: How to switch Provider?

Run /models command and select the model you want to use.

Q: Can I use multiple Providers simultaneously?

Yes. Opencode supports configuring multiple Providers and can specify different models for different Agents.

Q: How is local model performance?

For simple tasks, local models like DeepSeek Coder perform well. For complex architecture and design work, however, large models like Claude Opus are recommended.

Q: How are API fees calculated?

Different Providers have different billing methods:

  • Claude Pro/ChatGPT Plus: Monthly subscription, no additional fees
  • DeepSeek API: Billed by token, approximately $0.0001/1K tokens
  • Ollama: Completely free
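Token-based pricing is easy to estimate. Using the roughly $0.0001 per 1K tokens figure quoted above (check DeepSeek's pricing page for current rates, which vary by model and cache hits):

```python
def estimate_cost(tokens, price_per_1k=0.0001):
    """Estimate API cost in dollars for a given token count."""
    return tokens / 1000 * price_per_1k

# A heavy day of, say, 500K tokens comes to about five cents:
daily = estimate_cost(500_000)
```

At these rates, token billing for routine coding work typically costs far less per month than a flat subscription, which is why the multi-Provider split above routes simple tasks to the cheap API.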

Next Steps

  • Configuration Priority and File Locations
  • Agent Configuration and Division
  • Permissions and Security Settings

This article is compiled by the OpenCodex community. If you have questions, feel free to submit an Issue on GitHub.

Last Updated: 2/28/26, 2:48 PM