MCP Ollama Consult Server
An MCP (Model Context Protocol) server that allows consulting with Ollama models for reasoning from alternative viewpoints.
Features
- consult_ollama: Send prompts to Ollama models and get responses (see the example request after this list)
- list_ollama_models: List available models on the local Ollama instance
- compare_ollama_models: Run the same prompt against multiple Ollama models and return their outputs side-by-side for comparison
- remember_consult: Store the result of a consult into a local memory store (or configured memory service)
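As a sketch of what a consult_ollama invocation looks like over MCP: the `tools/call` method and `name`/`arguments` structure are standard MCP JSON-RPC, but the argument names (`model`, `prompt`) are assumptions here; check the actual schema your client reports via `tools/list`.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_ollama",
    "arguments": {
      "model": "llama3",
      "prompt": "Argue the opposite position on this design decision: ..."
    }
  }
}
```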
Installation
- Install the server:

  ```bash
  npm i -g https://github.com/Atomic-Germ/mcp-consult/releases/download/v1.0.1/mcp-ollama-consult-1.0.1.tgz
  ```

- Configure the server in your MCP client's configuration file:

  ```json
  {
    "servers": {
      "ollama-consult": {
        "type": "stdio",
        "command": "mcp-ollama-consult",
        "args": []
      }
    },
    "inputs": []
  }
  ```
Usage
Make sure Ollama is running locally (default: http://localhost:11434).
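To quickly verify that Ollama is reachable, you can query its model-list endpoint (a standard Ollama REST API route):

```bash
curl http://localhost:11434/api/tags
```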
Start the MCP server:

```bash
mcp-ollama-consult
```

Or for development:

```bash
npm run dev
```
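If you are working from a clone of the repository, a minimal setup sketch (this assumes the standard npm workflow; check the repository's package.json for the exact script names):

```bash
git clone https://github.com/Atomic-Germ/mcp-consult.git
cd mcp-consult
npm install
npm run dev
```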
Configuration
Set the OLLAMA_BASE_URL environment variable to change the Ollama endpoint:
```bash
OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
```
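If your MCP client launches the server for you, the variable can usually be set in the server entry instead of the shell. A sketch, assuming your client supports the common `env` field in stdio server configs:

```json
{
  "servers": {
    "ollama-consult": {
      "type": "stdio",
      "command": "mcp-ollama-consult",
      "args": [],
      "env": {
        "OLLAMA_BASE_URL": "http://your-ollama-server:11434"
      }
    }
  },
  "inputs": []
}
```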
Docker
To run with Docker, use a Dockerfile along these lines (note that it copies dist/, so the project must be built first):

```dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY dist/ ./dist/
CMD ["node", "dist/index.js"]
```
Requirements
- Node.js 18+
- Ollama running locally or accessible via HTTP