# Opencode Local Models Complete Guide
Use Ollama to run AI models locally with Opencode for complete privacy and zero API costs.
## Prerequisites

Install Ollama from [ollama.ai](https://ollama.ai) and make sure the local server is running.
## Configuration

Add the following to your Opencode config:
```json
{
  "providers": {
    "ollama": {
      "model": "codellama",
      "baseURL": "http://localhost:11434"
    }
  }
}
```
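Before pointing Opencode at the provider, it is worth checking that the `baseURL` in the config actually answers. A running Ollama server responds to a plain `GET /` on its base URL. The sketch below is an assumption of mine, not part of Opencode; the helper names are hypothetical:

```python
import json
from urllib import request, error

# The provider config from above, embedded for the sketch.
CONFIG = """
{
  "providers": {
    "ollama": {
      "model": "codellama",
      "baseURL": "http://localhost:11434"
    }
  }
}
"""

def ollama_base_url(config: dict) -> str:
    """Read the baseURL back out of the provider config."""
    return config["providers"]["ollama"]["baseURL"]

def server_is_up(base_url: str) -> bool:
    """Probe the Ollama server; GET on the base URL returns 200 when it is running."""
    try:
        with request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200
    except error.URLError:
        return False

base = ollama_base_url(json.loads(CONFIG))
print(base, "is up" if server_is_up(base) else "is down")
```

If the check prints "is down", start the server (`ollama serve`) before launching Opencode.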
## Recommended Models
- `codellama` - best for code generation
- `deepseek-coder` - excellent for complex tasks
- `qwen2.5-coder` - great balance of speed and quality
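A model from the list above must be pulled (e.g. `ollama pull codellama`) before it can be used. To see which models are already available locally, Ollama exposes a `/api/tags` endpoint on the same base URL as the config. A minimal sketch, with helper names of my own:

```python
import json
from urllib import request, error

OLLAMA_URL = "http://localhost:11434"  # default Ollama address, matching the config above

def installed_models(tags: dict) -> list[str]:
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in tags.get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Ask the running Ollama server which models have been pulled."""
    try:
        with request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            return installed_models(json.load(resp))
    except error.URLError:
        return []  # server not running or unreachable

if __name__ == "__main__":
    print(list_local_models())
```

An empty list usually means either the server is not running or no models have been pulled yet.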