
ollama-mcp

Updated
Feb 25, 2026

An MCP (Model Context Protocol) server that exposes local Ollama instances as tools for Claude Code.

Lets Claude offload code generation, drafts, embeddings, and quick questions to your local GPUs.

Setup

  1. Run the setup script:

    bash setup.sh
    

    This creates a venv, installs dependencies, generates a machine-specific config.json, and registers the MCP server with Claude Code.

    Note: setup.sh uses cygpath and targets Windows (Git Bash / MSYS2). On Linux/macOS, replace the cygpath -w calls with plain paths, or register manually:

    claude mcp add ollama -s user -- /path/to/.venv/bin/python /path/to/src/ollama_mcp/server.py
    
  2. Restart Claude Code.

Tools

| Tool | Description |
| --- | --- |
| ollama_generate | Single-turn prompt → response |
| ollama_chat | Multi-turn conversation |
| ollama_embed | Generate embedding vectors |
| ollama_list_models | List models on your Ollama instances |
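
Each tool is a thin wrapper around Ollama's REST API. As a rough sketch of the payloads involved (the endpoint paths are Ollama's documented API; the helper function names here are illustrative, not the server's actual internals):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default port

def build_generate_request(model: str, prompt: str) -> tuple[str, dict]:
    """Payload shape for a single-turn call, as used by ollama_generate."""
    return (f"{OLLAMA_URL}/api/generate",
            {"model": model, "prompt": prompt, "stream": False})

def build_embed_request(model: str, texts: list[str]) -> tuple[str, dict]:
    """Payload for /api/embed -- this endpoint requires Ollama 0.4.0+."""
    return (f"{OLLAMA_URL}/api/embed",
            {"model": model, "input": texts})

def post(url: str, body: dict) -> dict:
    """Send a request; only works against a running Ollama instance."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Setting `"stream": False` asks Ollama for one complete JSON response instead of a stream of chunks, which is the simpler fit for a tool call that returns a single result.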

Configuration

Copy config.example.json to config.json and fill in your machine details, or let setup.sh generate it interactively.
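
The authoritative schema is whatever config.example.json in the repo defines; the fragment below is only a guess at the general shape (every key name here is hypothetical):

```json
{
  "hosts": [
    { "name": "workstation", "url": "http://localhost:11434" }
  ],
  "default_model": "llama3.2",
  "timeout": 120
}
```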

Requirements

  • Python 3.10+
  • Ollama 0.4.0+ running on at least one machine
  • Claude Code with MCP support

Development

pip install -e ".[dev]"
pytest tests/ -v

Troubleshooting

| Problem | Cause | Fix |
| --- | --- | --- |
| config.json not found | Setup not run | Run bash setup.sh |
| 404 on embed calls | Ollama < 0.4.0 | Upgrade Ollama (ollama update) |
| Cannot connect to... | Ollama not running on target host | Start Ollama: ollama serve, or check Docker |
| Request timed out | Large model / slow hardware | Increase timeout in config.json, or pass the timeout parameter |
| OFFLINE in list_models | Host unreachable | Check network, firewall, and Ollama port 11434 |
| cygpath: command not found | Running setup.sh on Linux/macOS | See the setup note above |
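
When diagnosing connection failures or OFFLINE hosts, it can help to probe the Ollama port directly. A minimal sketch (the /api/tags endpoint is Ollama's model-listing API; the host and port values are examples):

```python
import json
import urllib.error
import urllib.request

def ollama_reachable(host: str, port: int = 11434, timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers /api/tags on host:port."""
    url = f"http://{host}:{port}/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            json.load(resp)  # valid JSON means a live Ollama (or compatible) server
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False

if __name__ == "__main__":
    print(ollama_reachable("localhost"))
```

If this returns False while Ollama is running, check that Ollama is bound to the right interface (by default it listens on localhost only, which remote hosts cannot reach).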
