# AI Coding Tools

Configure AI-powered coding assistants to use local LLM backends.
## Overview

AI coding tools can connect to:

- **Local Ollama** - self-hosted and private
- **Local llama.cpp** - custom builds and quantization options
- **OpenAI-compatible APIs** - any endpoint that speaks the OpenAI API
- **Cloud APIs** - Anthropic, OpenAI (as a fallback)
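The main practical difference is the base URL a tool expects. Ollama, for example, serves both its native API and an OpenAI-compatible API from the same port (the paths below are Ollama's defaults):

```bash
# Native Ollama API (used by Ollama-aware tools)
curl http://localhost:11434/api/tags

# OpenAI-compatible API (same server, /v1 prefix)
curl http://localhost:11434/v1/models
```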
## Tool Comparison

| Tool | Interface | Local Model Support | Best For |
|---|---|---|---|
| Claude Code | CLI | Via API proxy | Anthropic CLI users |
| Aider | CLI | Native Ollama | Git-integrated coding |
| Cline | VS Code | Native | VS Code users |
| Continue.dev | Multi-editor | Native | IDE integration |
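"Via API proxy" for Claude Code means routing its Anthropic-format requests through a local translation layer. A minimal sketch, assuming a proxy such as a LiteLLM gateway is already running on port 4000 and translating to your local backend; the port and key handling depend on your proxy:

```bash
# Point Claude Code at a local Anthropic-compatible proxy (hypothetical setup)
export ANTHROPIC_BASE_URL=http://localhost:4000
export ANTHROPIC_API_KEY=not-needed   # most proxies still expect a non-empty key
claude
```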
## Feature Matrix

```
┌──────────────────────────────────────────────────────────────┐
│                       AI Coding Tools                        │
├──────────────────────────────────────────────────────────────┤
│                                                              │
│  CLI Tools:                                                  │
│  ┌──────────────────────┐   ┌──────────────────────┐         │
│  │ Claude Code          │   │ Aider                │         │
│  ├──────────────────────┤   ├──────────────────────┤         │
│  │ ✓ Official Claude    │   │ ✓ Git-aware          │         │
│  │ ✓ Multi-file edits   │   │ ✓ Local models       │         │
│  │ ✓ Code execution     │   │ ✓ Auto-commits       │         │
│  │ ✓ File operations    │   │ ✓ Multiple LLMs      │         │
│  └──────────────────────┘   └──────────────────────┘         │
│                                                              │
│  IDE Extensions:                                             │
│  ┌──────────────────────┐   ┌──────────────────────┐         │
│  │ Cline                │   │ Continue.dev         │         │
│  ├──────────────────────┤   ├──────────────────────┤         │
│  │ ✓ VS Code            │   │ ✓ VS Code, JetBrains │         │
│  │ ✓ Plan/Act modes     │   │ ✓ Autocomplete       │         │
│  │ ✓ Local providers    │   │ ✓ Chat + Edit        │         │
│  │ ✓ MCP support        │   │ ✓ Local-first        │         │
│  └──────────────────────┘   └──────────────────────┘         │
│                                                              │
└──────────────────────────────────────────────────────────────┘
```
## Recommended Setup

### Development Workflow

```
           ┌─────────────────┐
           │  Code Question  │
           └────────┬────────┘
                    │
     ┌──────────────┼──────────────┐
     │              │              │
     ▼              ▼              ▼
┌──────────┐   ┌──────────┐   ┌──────────┐
│ Continue │   │  Aider   │   │  Cline   │
│  (IDE)   │   │  (CLI)   │   │ (VS Code)│
└────┬─────┘   └────┬─────┘   └────┬─────┘
     │              │              │
     └──────────────┴──────────────┘
                    │
             ┌──────┴──────┐
             │   Ollama    │
             │ (Local LLM) │
             └─────────────┘
```
### Model Recommendations

| Task | Model | Notes |
|---|---|---|
| Code completion | DeepSeek Coder V2 16B | Fast, accurate |
| Code chat | Llama 3.3 70B | Good reasoning |
| Refactoring | Qwen 2.5 Coder 32B | Balanced speed and quality |
| Quick answers | Mistral 7B | Very fast |
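To have these models available locally, pull them with Ollama first. The tags below are current Ollama library names and may change; check with `ollama list` afterwards:

```bash
ollama pull deepseek-coder-v2:16b   # code completion
ollama pull llama3.3:70b            # code chat; needs substantial RAM/VRAM
ollama pull qwen2.5-coder:32b       # refactoring
ollama pull mistral:7b              # quick answers
```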
## Quick Configuration

### Environment Variables

```bash
# For OpenAI-compatible tools
export OPENAI_API_BASE=http://localhost:11434/v1
export OPENAI_API_KEY=not-needed   # Ollama ignores the key, but clients require one

# For Ollama-native tools
export OLLAMA_HOST=http://localhost:11434
```
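With those variables set, a quick `curl` confirms the endpoint is reachable (Ollama ignores the bearer token, but sending it mirrors what an OpenAI-compatible client would do):

```bash
curl -s "$OPENAI_API_BASE/models" \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```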
### Ollama Setup

Ensure Ollama is running and serving your preferred model:

```bash
# Start Ollama
ollama serve

# Pull a coding model
ollama pull deepseek-coder-v2:16b

# Verify the OpenAI-compatible endpoint
curl http://localhost:11434/v1/models
```
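To sanity-check generation end to end, send a minimal request to the OpenAI-compatible chat endpoint (the model name must match a tag you have pulled):

```bash
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-coder-v2:16b",
    "messages": [{"role": "user", "content": "Write a one-line hello world in Python."}]
  }'
```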
## Topics

- **Claude Code** - Anthropic's official CLI for Claude
- **Aider** - Git-aware AI pair programming in the terminal
- **Cline** - VS Code extension with Plan/Act modes
- **Continue.dev** - Multi-editor extension with local-first configuration
## Model Configuration Examples

### For Aider

```bash
# Using Ollama
aider --model ollama/deepseek-coder-v2:16b

# Using an OpenAI-compatible endpoint (note the openai/ prefix,
# which routes the request through the custom base URL)
aider --openai-api-base http://localhost:8080/v1 --model openai/llama3.3
```
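To avoid retyping flags, aider also reads every option from an `AIDER_`-prefixed environment variable; a sketch for pinning the Ollama model:

```bash
# Equivalent to passing --model on every invocation
export AIDER_MODEL=ollama/deepseek-coder-v2:16b
export OLLAMA_API_BASE=http://localhost:11434
```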
### For Continue.dev

Add the model to your Continue configuration (typically `~/.continue/config.json`):

```json
{
  "models": [
    {
      "title": "DeepSeek Coder",
      "provider": "ollama",
      "model": "deepseek-coder-v2:16b"
    }
  ]
}
```
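Continue can also use a separate, smaller model for inline autocomplete. A minimal sketch, assuming the `tabAutocompleteModel` field of the JSON config format and a pulled `qwen2.5-coder:1.5b` tag:

```json
{
  "tabAutocompleteModel": {
    "title": "Qwen 2.5 Coder 1.5B",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```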
### For Cline

Cline is configured through the VS Code UI rather than a config file:

**Settings → Cline → API Provider → Ollama**
## See Also

- Inference Engines - Backend setup
- Ollama - Local model serving
- API Serving - OpenAI-compatible APIs
- Choosing Models - Model selection