Advanced Configuration
Once you are familiar with Opencode's basic operations, you can build a more powerful workflow through its configuration files.
settings.json
Opencode is compatible with VS Code's settings.json configuration system and adds a number of AI-specific options on top.
{
  "editor.fontSize": 14,
  "opencode.ai.model": "gpt-4-turbo",
  "opencode.ai.autoSuggest": true,
  "opencode.terminal.shell": "zsh"
}
Configuring Custom Models
Using Ollama (Local Models)
Opencode supports Ollama, which means you can run Llama 3 or DeepSeek locally for free.
- Ensure you have installed and started Ollama (`ollama serve`).
- Run `/connect` in Opencode or open Settings.
- Add a new Provider and select OpenAI Compatible.
- Fill in the configuration:
  - Base URL: `http://localhost:11434/v1`
  - API Key: `ollama` (any value works)
  - Model ID: `llama3` or `deepseek-coder`
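If you prefer to set the provider up in config.json rather than through `/connect`, the entry might look roughly like the sketch below. The key names (`providers`, `type`, `baseUrl`, `apiKey`, `models`) are illustrative placeholders rather than the exact schema, which can vary between Opencode versions; the comment would also need to be removed if your config file requires strict JSON.

{
  // Key names here are illustrative; check the schema for your Opencode version.
  "providers": {
    "ollama": {
      "type": "openai-compatible",
      "baseUrl": "http://localhost:11434/v1",
      "apiKey": "ollama",
      "models": ["llama3", "deepseek-coder"]
    }
  }
}

Once the provider is saved and Opencode is restarted, the local models should show up in the model picker alongside any cloud models.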
Using DeepSeek Official API
If you want to use DeepSeek's cloud API:
- Base URL: `https://api.deepseek.com/v1`
- API Key: Fill in your DeepSeek key
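A config.json entry for DeepSeek would follow the same pattern as the Ollama sketch above. The key names are again illustrative placeholders, and `deepseek-chat` is used here only as an example model ID; replace `YOUR_DEEPSEEK_KEY` with your actual key.

{
  // Illustrative key names; "deepseek-chat" is one of DeepSeek's published model IDs.
  "providers": {
    "deepseek": {
      "type": "openai-compatible",
      "baseUrl": "https://api.deepseek.com/v1",
      "apiKey": "YOUR_DEEPSEEK_KEY",
      "models": ["deepseek-chat"]
    }
  }
}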
Configuration File Location
If you prefer to modify files directly, you can find config.json at:
- macOS/Linux: `~/.opencode/config.json`
- Windows: `%USERPROFILE%\.opencode\config.json`
Keybindings
It is recommended to bind common AI functions to handy shortcuts:
- Chat View: `Cmd+L`
- Inline Edit: `Cmd+K`
- Explain Code: `Cmd+.`
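If Opencode reads VS Code-style keybinding files (as its settings.json compatibility suggests), a keybindings.json along the following lines would wire these up. The command IDs (`opencode.chat.open` and so on) are hypothetical placeholders; look up the real command names in the command palette before copying this.

[
  // Command IDs below are placeholders for illustration, not Opencode's actual command names.
  { "key": "cmd+l", "command": "opencode.chat.open" },
  { "key": "cmd+k", "command": "opencode.inlineEdit.start" },
  { "key": "cmd+.", "command": "opencode.explainCode" }
]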
Recommended Extensions
- Python: Official Python extension
- GitLens: Enhanced Git capabilities
- Prettier: Code formatting