AI Features¶
Configuring AI assistants and code completion in Zed.
Overview¶
Zed provides three types of AI integration:
- Edit Predictions: Inline code completions (GitHub Copilot, Supermaven, or Zed's built-in provider)
- Agent: AI chat assistant for code questions and generation
- Inline Assist: AI-powered code transformations
Edit Predictions (Copilot)¶
Enable Copilot¶
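Set Copilot as the edit prediction provider in your settings.json. A minimal sketch (the full example under Copilot Settings below adds file exclusions):

```json
{
  "features": {
    "edit_prediction_provider": "copilot"
  }
}
```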
Providers¶
| Provider | Description |
|---|---|
| `copilot` | GitHub Copilot |
| `supermaven` | Supermaven (fast completions) |
| `zed` | Zed's built-in completions |
| `none` | Disable predictions |
Sign In to Copilot¶
- Open the Command Palette (Cmd+Shift+P)
- Search for "copilot: sign in"
- Follow the authentication flow
Copilot Settings¶
```json
{
  "features": {
    "edit_prediction_provider": "copilot"
  },
  "copilot": {
    "disabled_globs": [
      ".env",
      "*.pem",
      "*.key"
    ]
  }
}
```
Using Completions¶
- Completions appear as gray text
- Press Tab to accept
- Press Esc to dismiss
- Continue typing to ignore
Agent (AI Assistant)¶
Enable Agent¶
```json
{
  "agent": {
    "enabled": true,
    "default_model": {
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514"
    }
  }
}
```
Available Providers¶
| Provider | Models |
|---|---|
| `anthropic` | claude-sonnet-4-20250514, claude-opus-4-20250514 |
| `openai` | gpt-4o, gpt-4-turbo, gpt-3.5-turbo |
| `copilot_chat` | gpt-4o (via Copilot subscription) |
| `ollama` | Local models |
| `google` | gemini-pro, gemini-ultra |
Anthropic (Claude) Setup¶
- Get an API key from the Anthropic Console
- Configure it in Zed settings or via an environment variable

Set the API key:
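A minimal sketch, assuming Zed reads the standard `ANTHROPIC_API_KEY` environment variable (the key value is a placeholder):

```sh
# Placeholder value; use the key generated in the Anthropic Console
export ANTHROPIC_API_KEY="sk-ant-..."
```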
Or via Zed: Command Palette > "assistant: configure api key"
OpenAI Setup¶
Set API key:
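A minimal sketch, assuming the standard `OPENAI_API_KEY` environment variable (placeholder value):

```sh
# Placeholder value; use the key from your OpenAI account
export OPENAI_API_KEY="sk-..."
```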
Copilot Chat (No Extra Cost)¶
Use your Copilot subscription for the Agent chat:
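A minimal sketch, mirroring the Copilot-only recommended configuration later on this page:

```json
{
  "agent": {
    "enabled": true,
    "default_model": {
      "provider": "copilot_chat",
      "model": "gpt-4o"
    }
  }
}
```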
Requires Copilot sign-in.
Ollama (Local Models)¶
Run models locally with Ollama:
- Install Ollama: `brew install ollama`
- Start the server: `ollama serve`
- Pull a model: `ollama pull llama3.2`
- Configure Zed to use the Ollama provider (see the sketch below)
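A minimal configuration sketch, assuming the `llama3.2` model pulled above:

```json
{
  "agent": {
    "enabled": true,
    "default_model": {
      "provider": "ollama",
      "model": "llama3.2"
    }
  }
}
```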
Using the Agent¶
Open Agent panel:
- Cmd+Shift+A or
- Command Palette > "assistant: new conversation"
In the Agent:
- Type questions naturally
- Reference code with `@file.py`
- Use `/commands` for actions
- Click "Apply" to insert code suggestions
Agent Slash Commands¶
| Command | Action |
|---|---|
| `/file` | Include file content |
| `/tab` | Include open tab content |
| `/selection` | Include selected text |
| `/diagnostics` | Include current diagnostics |
| `/terminal` | Include terminal output |
| `/fetch` | Fetch URL content |
Inline Assist¶
Transform code with AI directly in the editor.
Trigger Inline Assist¶
- Select code
- Press Cmd+Enter or Ctrl+Enter
- Type instruction
- Press Enter to apply
Examples¶
- Select function > "Add error handling"
- Select code > "Convert to async/await"
- Select block > "Add TypeScript types"
- Cursor in function > "Write tests for this function"
Inline Assist Settings¶
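By default, Inline Assist typically uses the Agent's configured default model. The sketch below additionally assumes your Zed version supports a separate `inline_assistant_model` key under `agent`; treat that key name as an assumption and check Zed's settings documentation for your version (Zed's settings.json accepts comments):

```json
{
  "agent": {
    "default_model": {
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514"
    },
    // Assumed key: overrides the model used for inline assist only
    "inline_assistant_model": {
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514"
    }
  }
}
```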
Recommended Configuration¶
For Claude Users¶
```json
{
  "features": {
    "edit_prediction_provider": "copilot"
  },
  "agent": {
    "enabled": true,
    "default_model": {
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514"
    },
    "version": "2"
  }
}
```
For Copilot-Only Users¶
```json
{
  "features": {
    "edit_prediction_provider": "copilot"
  },
  "agent": {
    "enabled": true,
    "default_model": {
      "provider": "copilot_chat",
      "model": "gpt-4o"
    }
  }
}
```
For Local-Only (Ollama)¶
```json
{
  "features": {
    "edit_prediction_provider": "none"
  },
  "agent": {
    "enabled": true,
    "default_model": {
      "provider": "ollama",
      "model": "codellama:34b"
    }
  }
}
```
Disable All AI¶
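A minimal sketch combining the options shown above: turn off edit predictions and disable the Agent entirely:

```json
{
  "features": {
    "edit_prediction_provider": "none"
  },
  "agent": {
    "enabled": false
  }
}
```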
Multiple Model Configuration¶
Configure multiple models for different tasks:
```json
{
  "agent": {
    "default_model": {
      "provider": "anthropic",
      "model": "claude-sonnet-4-20250514"
    },
    "models": {
      "anthropic": {
        "available_models": [
          {
            "name": "claude-sonnet-4-20250514",
            "max_tokens": 8192
          },
          {
            "name": "claude-opus-4-20250514",
            "max_tokens": 4096
          }
        ]
      }
    }
  }
}
```
Switch models in Agent panel using the model selector.
Privacy Considerations¶
Disable Telemetry¶
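A sketch assuming Zed's standard `telemetry` settings with `diagnostics` and `metrics` toggles:

```json
{
  "telemetry": {
    "diagnostics": false,
    "metrics": false
  }
}
```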
Exclude Sensitive Files from Copilot¶
```json
{
  "copilot": {
    "disabled_globs": [
      ".env*",
      "*.pem",
      "*.key",
      "**/secrets/**",
      "**/credentials/**"
    ]
  }
}
```
Troubleshooting¶
Copilot Not Working¶
- Check sign-in status: Command Palette > "copilot: status"
- Re-authenticate: Command Palette > "copilot: sign out" then sign in
- Check subscription is active
Agent Not Responding¶
- Verify API key is set
- Check network connectivity
- View logs: Command Palette > "zed: open log"
Ollama Connection Failed¶
- Ensure Ollama is running: `ollama serve`
- Verify the model is pulled: `ollama list`
- Check that the default port (11434) is available
Keybindings¶
| Key | Action |
|---|---|
| Cmd+Shift+A | Open Agent panel |
| Cmd+Enter | Inline assist (with selection) |
| Tab | Accept completion |
| Esc | Dismiss completion |