LocalDNS.wtf

Integrate local AI runners like Ollama, LocalAI, and LM Studio with LocalDNS for seamless .local domain access and MCP configuration.

Local AI Runners Integration

LocalDNS integrates with locally hosted AI runners, making them discoverable and accessible via .local domains.

Supported Runners

| Runner    | Default Port | Domain         | MCP Compatible |
|-----------|--------------|----------------|----------------|
| Ollama    | 11434        | ollama.local   | Yes            |
| LocalAI   | 8080         | localai.local  | Yes            |
| Jan.ai    | 8000         | jan.local      | Yes            |
| LM Studio | 1234         | lmstudio.local | Yes            |
| vLLM      | 8000         | vllm.local     | Yes            |

Quick Start

1. Scan for Running AI Runners

localdns ai scan

This command scans the common default ports, detects any running runners, lists their available models, and prints the corresponding MCP configuration.
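Conceptually, this kind of scan can be approximated with a plain TCP probe of the default ports from the table above. The sketch below is illustrative only, not LocalDNS's actual implementation, and the port list is taken from that table:

```python
import socket

# Default ports from the runner table above. Note that Jan.ai and vLLM
# share port 8000, so a bare TCP probe alone cannot tell them apart.
DEFAULT_PORTS = {
    "ollama": 11434,
    "localai": 8080,
    "jan": 8000,
    "lmstudio": 1234,
}

def probe(port, host="localhost", timeout=0.5):
    """Return True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan():
    """Names of runners whose default port accepts a TCP connection."""
    return [name for name, port in DEFAULT_PORTS.items() if probe(port)]

print(scan())  # names of runners found listening locally
```

A real scanner would follow the TCP probe with an HTTP request (e.g. `GET /api/tags` for Ollama) to confirm which runner actually owns the port.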

2. Auto-Register Detected Runners

localdns ai scan --register

3. Register Specific Runner

localdns ai register ollama
localdns ai register localai --port 9000
localdns ai register ollama --name my-ai.local

4. Generate MCP Configuration

localdns ai mcp
localdns ai mcp --output ~/.cursor/mcp.json

MCP Integration

The ai mcp command generates a configuration compatible with Cursor, VSCode, and other MCP clients:

{
  "mcpServers": {
    "local-ollama": {
      "url": "http://localhost:11434",
      "description": "Ollama - Local LLM runner"
    },
    "local-localai": {
      "url": "http://localhost:8080",
      "description": "LocalAI - OpenAI-compatible API server"
    }
  }
}
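The structure above is plain JSON, so an equivalent config is easy to assemble programmatically. A minimal sketch, assuming detection results in the form shown (the runner list and descriptions are placeholders, not LocalDNS output):

```python
import json

# Hypothetical detection results; in practice these would come from a scan.
detected = [
    ("ollama", 11434, "Ollama - Local LLM runner"),
    ("localai", 8080, "LocalAI - OpenAI-compatible API server"),
]

def mcp_config(runners):
    """Build an MCP client config in the shape shown above."""
    return {
        "mcpServers": {
            f"local-{name}": {
                "url": f"http://localhost:{port}",
                "description": desc,
            }
            for name, port, desc in runners
        }
    }

print(json.dumps(mcp_config(detected), indent=2))
```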

Setup Steps

  1. Scan for runners: localdns ai scan
  2. Copy the MCP config from the output
  3. Add to your MCP config:
    • Cursor: ~/.cursor/mcp.json
    • VSCode: .vscode/mcp.json
  4. Reload your IDE

Using AI Runners via LocalDNS

Once registered, access AI runners via their .local domains:

curl http://ollama.local/api/chat
curl http://localai.local/v1/models
curl http://jan.local/v1/chat/completions

Standard Endpoints

All runners expose OpenAI-compatible endpoints:

  • Chat: POST /v1/chat/completions (or /api/chat for Ollama)
  • Models: GET /v1/models (or /api/tags for Ollama)
  • Completions: POST /v1/completions (legacy)

Troubleshooting

Runner Not Detected

# Check if runner is running
curl http://localhost:11434/api/tags  # Ollama
curl http://localhost:8080/v1/models  # LocalAI

# Check port
lsof -i :11434

# Try manual registration
localdns ai register ollama --port 11434

MCP Not Connecting

  1. Verify MCP config: localdns ai mcp
  2. Check the URL format: it must be http://localhost:PORT
  3. Reload IDE after updating MCP config