Advanced Configuration

Once you are familiar with the basics of Opencode, you can build a more powerful workflow through its configuration files.

settings.json

Opencode is compatible with VS Code's settings.json configuration system and adds a number of AI-specific options.

{
  // Standard VS Code settings work as usual
  "editor.fontSize": 14,
  // Opencode-specific AI options
  "opencode.ai.model": "gpt-4-turbo",
  "opencode.ai.autoSuggest": true,
  "opencode.terminal.shell": "zsh"
}

Configuring Custom Models

Using Ollama (Local Models)

Opencode fully supports Ollama, so you can run models such as Llama 3 or DeepSeek locally for free.

  1. Ensure you have installed and started Ollama (ollama serve).
  2. Run /connect in Opencode or open Settings.
  3. Add a new Provider and select OpenAI Compatible.
  4. Fill in the configuration (see the sketch after this list):
    • Base URL: http://localhost:11434/v1
    • API Key: ollama (any value works; Ollama does not check it)
    • Model ID: llama3 or deepseek-coder
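
If you would rather write this by hand instead of using the /connect dialog, the provider entry in config.json might look roughly like the sketch below. This is only an illustration: the field names (providers, baseUrl, apiKey, model) are assumptions rather than a documented schema, so compare against what Opencode actually writes to your config.json.

{
  "providers": [
    {
      // Illustrative local Ollama provider; field names are assumed, not official
      "name": "ollama-local",
      "type": "openai-compatible",
      "baseUrl": "http://localhost:11434/v1",
      "apiKey": "ollama",
      "model": "llama3"
    }
  ]
}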

Using DeepSeek Official API

If you want to use DeepSeek's cloud API instead, use these values (a sketch follows the list):

  1. Base URL: https://api.deepseek.com/v1
  2. API Key: Fill in your DeepSeek Key
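
The same provider shape works for the cloud API; only the endpoint, key, and model change. As above, the field names are illustrative assumptions, and deepseek-chat is shown only as a typical model ID.

{
  // Hypothetical DeepSeek cloud provider entry; insert your real API key
  "name": "deepseek-cloud",
  "type": "openai-compatible",
  "baseUrl": "https://api.deepseek.com/v1",
  "apiKey": "<your DeepSeek API key>",
  "model": "deepseek-chat"
}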

Configuration File Location

If you prefer to modify files directly, you can find config.json at:

  • macOS/Linux: ~/.opencode/config.json
  • Windows: %USERPROFILE%\.opencode\config.json

Keybindings

We recommend binding common AI actions to convenient shortcuts, for example (a keybindings.json sketch follows this list):

  • Chat View: Cmd+L
  • Inline Edit: Cmd+K
  • Explain Code: Cmd+.
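
Because Opencode follows VS Code's configuration model, these bindings could be declared in a keybindings.json file. The command identifiers below are hypothetical placeholders; look up the real IDs in the Keyboard Shortcuts editor before using them.

[
  // Command IDs are assumed for illustration; verify them in the Keyboard Shortcuts editor
  { "key": "cmd+l", "command": "opencode.chat.open" },
  { "key": "cmd+k", "command": "opencode.inlineEdit.start", "when": "editorTextFocus" },
  { "key": "cmd+.", "command": "opencode.explainCode", "when": "editorHasSelection" }
]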

Recommended Extensions

  • Python: Official Python extension
  • GitLens: Enhanced Git capabilities
  • Prettier: Code formatting
Last Updated: 1/15/26, 4:12 PM
Contributors: souvc