# Best Practices Guide
Mastering the best practices below is crucial to getting a 10x efficiency boost from Opencode.
## Model Selection

### What is "Opencode Big Pickle"?

"Big Pickle" is a community nickname for a high-performance model configuration that Opencode uses internally.

- It is optimized for long-context work.
- It excels at repository-wide refactoring tasks.
### Best Model Recommendations

- Overall Strongest: Claude 3.5 Sonnet (currently top-tier coding capability).
- Best Local Model: DeepSeek Coder V2 (the best cost-to-performance ratio for local deployment).
- Best Ollama Model: Llama 3 70B (if you have enough VRAM).
- Opencode Default Model: usually a fine-tuned GPT-4o or Gemini 1.5 Pro.
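If your Opencode setup reads its model choice from a JSON config file, pinning a preferred model might look like the sketch below. The file keys and model identifiers here are illustrative assumptions, not a documented schema — check your installation's configuration reference for the real key names:

```json
{
  "model": "anthropic/claude-3-5-sonnet",
  "fallbackModel": "ollama/llama3:70b"
}
```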
### Opencode Bedrock
Enterprise users can access Claude models through AWS Bedrock, ensuring data does not leave the private cloud.
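Independent of Opencode itself, the underlying Bedrock call is ordinary AWS API usage. Below is a minimal sketch using boto3, assuming AWS credentials are already configured; the model ID and region are examples, so check the Bedrock console for the models enabled in your account:

```python
import json

def build_claude_request(prompt: str, max_tokens: int = 1024) -> str:
    """Build a Bedrock Messages-API request body for a Claude model."""
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

# The actual invocation (requires boto3 and AWS credentials; commented out
# here because it performs a network call):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.invoke_model(
#     modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
#     body=build_claude_request("Summarize this repo's architecture."),
# )
# print(json.loads(response["body"].read())["content"][0]["text"])
```

Because traffic stays inside your AWS account, this is how the "data does not leave the private cloud" guarantee is realized in practice.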
## Agent Configuration (Agents & Skills)

### Master AGENTS.md

This file is the soul of an Opencode project.

- What: A Markdown file placed in the root directory of the project.
- Why: It tells the AI this project's "hidden rules".
- How: Write it once, and the AI remembers it forever.

A minimal example:

```markdown
# Project Context

We use Vue 3 + Tailwind CSS.
All API requests must be encapsulated using `@/utils/request`.
```
### Opencode Skills

A Skill is a tool the AI can call. Through MCP (Model Context Protocol), you can add new Skills to Opencode, such as "query a database" or "send a Slack message".
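As a sketch of what a Skill looks like on the wire: MCP servers advertise their tools as JSON objects with a name, a description, and a JSON-Schema `inputSchema`. The tool below is a hypothetical "query database" Skill for illustration, not one that ships with Opencode:

```python
# Hypothetical MCP tool definition for a "query database" Skill.
# An MCP server would return an object like this from its tools/list handler.
QUERY_DATABASE_TOOL = {
    "name": "query_database",
    "description": "Run a read-only SQL query and return the matching rows.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "sql": {"type": "string", "description": "A single SELECT statement."},
        },
        "required": ["sql"],
    },
}

def validate_call(tool: dict, arguments: dict) -> None:
    """Minimal check that a tool call supplies every required argument."""
    missing = [k for k in tool["inputSchema"]["required"] if k not in arguments]
    if missing:
        raise ValueError(f"missing required arguments: {missing}")
```

When the AI decides to use a Skill, the MCP client validates the arguments against the schema before dispatching the call to the server.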
## Benchmarks

In multiple AI programming benchmarks (such as SWE-bench), Opencode significantly outperforms simple Copilot-style tools at solving complex problems, thanks to its distinctive Build vs. Plan architecture.