# AI Runners
Integrate local AI runners like Ollama, LocalAI, and LM Studio with LocalDNS for seamless .local domain access and MCP configuration.
## Local AI Runners Integration
LocalDNS seamlessly integrates with local sovereign AI systems, making them discoverable and accessible via .local domains.
## Supported Runners
| Runner | Default Port | Domain | MCP Compatible |
|---|---|---|---|
| Ollama | 11434 | ollama.local | Yes |
| LocalAI | 8080 | localai.local | Yes |
| Jan.ai | 8000 | jan.local | Yes |
| LM Studio | 1234 | lmstudio.local | Yes |
| vLLM | 8000 | vllm.local | Yes |
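The default ports in the table can be probed directly. A minimal sketch (the port map mirrors the table above; a runner started on a non-default port will be missed, and Jan.ai and vLLM share port 8000, so a bare port hit cannot distinguish them — the real `localdns ai scan` presumably also queries the API):

```python
import socket

# Default ports from the table above; runners on custom ports won't be found.
DEFAULT_PORTS = {
    "ollama": 11434,
    "localai": 8080,
    "jan": 8000,
    "lmstudio": 1234,
    "vllm": 8000,
}

def detect_runners(host="127.0.0.1", timeout=0.25):
    """Return names of runners whose default port accepts a TCP connection."""
    found = []
    for name, port in DEFAULT_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                found.append(name)
    return found

print(detect_runners())
```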
## Quick Start
### 1. Scan for Running AI Runners
```bash
localdns ai scan
```

This scans common ports, detects running runners, lists their available models, and prints the corresponding MCP configuration.
### 2. Auto-Register Detected Runners
```bash
localdns ai scan --register
```

### 3. Register a Specific Runner
```bash
localdns ai register ollama
localdns ai register localai --port 9000
localdns ai register ollama --name my-ai.local
```

### 4. Generate MCP Configuration
```bash
localdns ai mcp
localdns ai mcp --output ~/.cursor/mcp.json
```

## MCP Integration
The `ai mcp` command generates a configuration compatible with Cursor, VSCode, and other MCP clients:
```json
{
  "mcpServers": {
    "local-ollama": {
      "url": "http://localhost:11434",
      "description": "Ollama - Local LLM runner"
    },
    "local-localai": {
      "url": "http://localhost:8080",
      "description": "LocalAI - OpenAI-compatible API server"
    }
  }
}
```

### Setup Steps
1. Scan for runners: `localdns ai scan`
2. Copy the MCP config from the output
3. Add it to your MCP config:
   - Cursor: `~/.cursor/mcp.json`
   - VSCode: `.vscode/mcp.json`
4. Reload your IDE
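Note that `localdns ai mcp --output` writes the whole file. If you already keep other servers in your `mcp.json`, a small merge helper can preserve them; this is a hypothetical sketch (`merge_mcp_config` is not part of LocalDNS):

```python
import json
import pathlib
import tempfile

def merge_mcp_config(path, new_servers):
    """Merge generated server entries into an existing MCP config file,
    keeping any servers the user has already defined."""
    path = pathlib.Path(path)
    config = json.loads(path.read_text()) if path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    for name, entry in new_servers.items():
        servers.setdefault(name, entry)  # don't clobber existing entries
    path.write_text(json.dumps(config, indent=2))
    return config

# Example entry in the shape `localdns ai mcp` prints:
generated = {
    "local-ollama": {
        "url": "http://localhost:11434",
        "description": "Ollama - Local LLM runner",
    }
}
target = pathlib.Path(tempfile.mkdtemp()) / "mcp.json"
merged = merge_mcp_config(target, generated)
```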
## Using AI Runners via LocalDNS
Once registered, access AI runners via their .local domains:
```bash
curl http://ollama.local/api/chat
curl http://localai.local/v1/models
curl http://jan.local/v1/chat/completions
```

### Standard Endpoints
All runners expose OpenAI-compatible endpoints:

- Chat: `POST /v1/chat/completions` (or `/api/chat` for Ollama)
- Models: `GET /v1/models` (or `/api/tags` for Ollama)
- Completions: `POST /v1/completions` (legacy)
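A chat call against any of these endpoints can be sketched like this (the model name `llama3` is an illustrative assumption, and for Ollama this uses its OpenAI-compatibility route rather than the native `/api/chat`):

```python
import json
import urllib.request

def build_chat_payload(model, prompt):
    """OpenAI-style chat payload accepted by the runners listed above."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(base_url, model, prompt):
    """POST a chat request to the runner's /v1/chat/completions endpoint."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Requires a registered runner with the model available, e.g.:
# chat("http://ollama.local", "llama3", "Hello!")
```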
## Troubleshooting
### Runner Not Detected
```bash
# Check if the runner is running
curl http://localhost:11434/api/tags   # Ollama
curl http://localhost:8080/v1/models   # LocalAI

# Check the port
lsof -i :11434

# Try manual registration
localdns ai register ollama --port 11434
```

### MCP Not Connecting
- Verify the MCP config: `localdns ai mcp`
- Check the URL format: it must be `http://localhost:PORT`
- Reload your IDE after updating the MCP config
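The URL-format check can be scripted. A sketch that flags entries not matching the `http://localhost:PORT` form required above (the validation rule comes from this checklist, not from a LocalDNS API):

```python
import re

def invalid_mcp_urls(config):
    """Return server names whose url is not of the form http://localhost:PORT."""
    pattern = re.compile(r"^http://localhost:\d+$")
    return [
        name
        for name, entry in config.get("mcpServers", {}).items()
        if not pattern.match(entry.get("url", ""))
    ]

config = {"mcpServers": {
    "local-ollama": {"url": "http://localhost:11434"},
    "broken": {"url": "ollama.local:11434"},  # missing scheme and host
}}
print(invalid_mcp_urls(config))  # ['broken']
```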