Troubleshooting Common Issues
This article summarizes the most common issues encountered when using Opencode, along with their solutions, so you can troubleshoot quickly.
Installation and Startup Issues
Issue 1: Cannot Start After Installation
Symptoms:
- Double-clicking the icon does nothing
- Running from the command line reports an error
- The app crashes immediately after startup
Solutions:
macOS:
# Remove the macOS Gatekeeper quarantine attribute
xattr -d com.apple.quarantine /Applications/Opencode.app
# Reinstall
brew uninstall opencode
brew install anomalyco/tap/opencode
# Check version
opencode --version
Windows:
# Run as administrator
# Right-click Opencode → Run as administrator
# Or reinstall
choco uninstall opencode
choco install opencode
Linux:
# Check dependencies
ldd /usr/bin/opencode
# Install missing dependencies
sudo apt-get install -f
# Check permissions
chmod +x /usr/bin/opencode
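If `ldd` prints a long list, you can filter it down to just the unresolved libraries (this assumes the binary lives at /usr/bin/opencode, as above):

```shell
# Show only the shared libraries the loader could not resolve;
# no output means every dependency was found
ldd /usr/bin/opencode | grep "not found"
```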
Issue 2: Slow Startup or Lag
Symptoms:
- Startup takes a long time
- High CPU usage after opening folder
- Slow interface response
Possible Causes:
- Monitoring large directories (like node_modules)
- Too many indexed files
- Insufficient memory
Solution:
Edit ~/.config/opencode/opencode.json:
{
  "watcher": {
    "ignore": [
      "node_modules/**",
      "dist/**",
      "build/**",
      ".git/**",
      ".next/**",
      "target/**",
      "venv/**",
      "__pycache__/**",
      "*.log"
    ]
  }
}
Restart Opencode for changes to take effect.
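If you are unsure which directories are causing the load, ranking the largest top-level folders in the project usually points at the culprits (a generic shell one-liner, not an Opencode command):

```shell
# Rank top-level directories (including dotted ones like .git) by size;
# heavy dependency/build folders are the usual watcher offenders
du -sh ./*/ ./.??*/ 2>/dev/null | sort -rh | head -5
```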
Issue 3: Version Too Old
Check version:
opencode --version
Requirement: Version needs to be >= 1.0.150
Update methods:
macOS:
brew upgrade opencode
Windows:
choco upgrade opencode
Linux:
curl -fsSL https://opencode.ai/install | bash
Configuration Issues
Issue 4: JSON Configuration Parse Failure
Symptoms:
- Configuration not taking effect
- Startup error: JSON parse error
- Logs show a configuration file error
Possible Causes:
- JSON syntax error (extra commas, missing quotes)
- Comments written in a .json file
Solutions:
Method 1: Use JSONC format
Rename file to .jsonc:
mv ~/.config/opencode/opencode.json ~/.config/opencode/opencode.jsonc
Method 2: Validate JSON syntax
Use online tool: jsonlint.com
Common errors:
❌ Wrong:
{
  "model": "claude-sonnet-4-5", // comments are not allowed in strict JSON
  "theme": "dark", // trailing comma after the last item
}
✅ Correct:
{
  "model": "claude-sonnet-4-5",
  "theme": "dark"
}
Or use JSONC:
{
  "model": "claude-sonnet-4-5", // comments are allowed in JSONC
  "theme": "dark"
}
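You can also validate the file locally instead of using an online tool (this assumes python3 is available; jq works just as well):

```shell
# python3 -m json.tool exits non-zero on any strict-JSON syntax error.
# A trailing comma fails:
printf '{ "model": "claude-sonnet-4-5", }' > /tmp/bad.json
python3 -m json.tool /tmp/bad.json > /dev/null 2>&1 || echo "invalid JSON"

# Without the trailing comma it passes:
printf '{ "model": "claude-sonnet-4-5" }' > /tmp/good.json
python3 -m json.tool /tmp/good.json > /dev/null 2>&1 && echo "valid JSON"
```

Run it against ~/.config/opencode/opencode.json to pinpoint the line of the first syntax error.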
Issue 5: Configuration Changes Not Taking Effect
Troubleshooting steps:
1. Confirm configuration file location:
# macOS/Linux
ls -la ~/.config/opencode/
# Windows
dir %USERPROFILE%\.config\opencode\
2. Check configuration priority:
Priority (low to high):
- Global config: ~/.config/opencode/opencode.json
- Project config: ./opencode.json
- Project directory config: ./.opencode/opencode.json
3. Validate configuration:
opencode config list
4. Restart Opencode:
Some configurations (like Provider, plugins) require restart to take effect.
Provider and Authentication Issues
Issue 6: Invalid API Key
Symptoms:
- An error prompt about the API Key
- Cannot connect to the model
- Authentication fails
Solutions:
1. Check API Key:
cat ~/.config/opencode/auth.json
2. Re-authenticate:
opencode auth login
3. Check API Key permissions:
- Ensure API Key hasn't expired
- Ensure API Key has sufficient quota
- Ensure API Key has correct permissions
4. Check network connection:
curl https://api.anthropic.com/v1/messages
curl https://api.openai.com/v1/models
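A bare curl against these endpoints prints an error body even when the network is fine. Checking the HTTP status code is clearer: any three-digit code means the host is reachable (401/405 just mean "reachable but unauthenticated"), while 000 means the connection itself failed:

```shell
# Print only the HTTP status code for each endpoint
curl -s -o /dev/null --max-time 10 -w "%{http_code}\n" https://api.anthropic.com/v1/messages
curl -s -o /dev/null --max-time 10 -w "%{http_code}\n" https://api.openai.com/v1/models
```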
Issue 7: Ollama Connection Failed
Symptoms:
- A "cannot connect to Ollama" prompt
- Model loading fails
Solutions:
1. Confirm Ollama is running:
# Check Ollama process
ps aux | grep ollama
# Start Ollama
ollama serve
2. Check port:
# Default port is 11434
curl http://localhost:11434/api/tags
3. Check configuration:
{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1" // note the /v1 suffix
      }
    }
  }
}
4. Confirm model is downloaded:
# View downloaded models
ollama list
# Download model
ollama pull llama3
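The checks above can be rolled into one quick probe (a generic script, not an Opencode command; adjust MODEL to whatever you configured):

```shell
MODEL="llama3"
# /api/tags lists the locally pulled models; grep for the configured name
if curl -sf --max-time 3 http://localhost:11434/api/tags | grep -q "\"$MODEL"; then
  echo "Ollama is up and $MODEL is available"
else
  echo "Ollama is unreachable or $MODEL is not pulled"
fi
```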
Performance Issues
Issue 8: Token Limit Exceeded
Symptoms:
- An error occurs mid-conversation
- A "context window full" prompt appears
- The conversation cannot continue
Solutions:
Method 1: Manual compression
/compact
Method 2: Enable auto-compression
Edit opencode.json:
{
  "compaction": {
    "auto": true,
    "prune": true
  }
}
Method 3: Use larger model
Switch to model supporting larger context:
- Claude Opus 4.5: 200K tokens
- GPT-5.2: 128K tokens
Method 4: Start new session
/new
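To get a rough sense of how many tokens a file will consume before it fills the context window, a common rule of thumb is about 4 characters per token for English text (tokenizers differ per model, so treat this purely as an estimate):

```shell
# Estimate token count from byte count (~4 chars/token heuristic)
FILE=/tmp/prompt.txt
printf 'example prompt text to be sent to the model' > "$FILE"
chars=$(wc -c < "$FILE")
echo "~$((chars / 4)) tokens"
```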
Issue 9: Slow Response Speed
Solutions:
1. Check network:
ping api.anthropic.com
2. Switch to a faster model:
{
  "small_model": "anthropic/claude-haiku-4-5"
}
3. Use a local model:
{
  "model": "ollama/deepseek-coder"
}
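Beyond ping, curl can break the round trip down into DNS, TLS, and total time, which helps tell a slow network from a slow provider (these are standard curl --write-out variables):

```shell
# Time each phase of one request to the API host
curl -s -o /dev/null --max-time 15 \
  -w "dns: %{time_namelookup}s  tls: %{time_appconnect}s  total: %{time_total}s\n" \
  https://api.anthropic.com
```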
Permission Issues
Issue 10: Cannot Read/Modify Files
Solutions:
1. Check permission configuration:
opencode config get permission
2. Temporarily elevate permissions:
State explicitly in the conversation:
I confirm this operation, please ignore permission restrictions
3. Modify permission configuration:
{
  "permission": {
    "read": {
      "*": "allow"
    },
    "edit": "ask",
    "bash": "ask"
  }
}
Diagnostic Tools
Health Check
Run diagnostic commands:
# Check configuration
opencode config validate
# Check Provider
opencode provider list
# Check Agent
opencode agent list
Debug Mode
Enable verbose logging:
{
  "logging": {
    "level": "debug"
  }
}
Getting Help
If none of the above resolves the problem:
- Check the official documentation: opencode.ai/docs
- Search the community: OpenCodex Community
- Ask a question: open a GitHub Issue or post in the community forum
- Contact support: email the official support team
Next Steps
This guide is compiled by the OpenCodex community. If you run into an issue not covered here, you are welcome to contribute a solution.