Opencode Local Models Complete Guide

Use Ollama to run AI models locally with Opencode: prompts and code never leave your machine, and there are no per-token API costs.

Prerequisites

Install Ollama from ollama.ai, then pull a coding model and make sure the Ollama server is running. By default it listens on http://localhost:11434, which matches the baseURL used in the config below.
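
As a rough sketch of the setup on Linux or macOS (the install script and model name below are the standard Ollama ones; check ollama.ai for platform-specific installers):

# Install Ollama (Linux; on macOS you can instead download the app from ollama.ai)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a coding model and start the server
# (skip "ollama serve" if Ollama is already running as a background service)
ollama pull codellama
ollama serve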

Configuration

Add to your Opencode config:

{
  "providers": {
    "ollama": {
      "model": "codellama",
      "baseURL": "http://localhost:11434"
    }
  }
}
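
Once the config is saved, a quick way to confirm that Ollama is reachable at the configured baseURL is to query its local API directly; the /api/tags endpoint lists every model you have pulled:

curl http://localhost:11434/api/tags

If this returns an empty model list, pull a model first with ollama pull.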

Recommended Models

  • codellama - Best for code generation
  • deepseek-coder - Excellent for complex tasks
  • qwen2.5-coder - Great balance of speed and quality
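
Each of these models is published on the Ollama registry in several parameter sizes; the tags below are commonly available ones, but verify against the Ollama model library if a pull fails:

ollama pull codellama:7b
ollama pull deepseek-coder:6.7b
ollama pull qwen2.5-coder:7b

After pulling, set the model value in the config above to the tag you chose (for example, "deepseek-coder:6.7b"). Smaller variants respond faster; larger ones generally produce better code.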

Next Steps

  • Provider Setup Guide
  • Advanced Configuration
Last Updated: 2/28/26, 2:48 PM