MCP Ollama Consult Server

An MCP (Model Context Protocol) server that allows consulting Ollama models for reasoning from alternative viewpoints, comparing outputs across multiple models, and storing consult results in memory.

Features

  • consult_ollama: Send prompts to Ollama models and get responses (see the example request below this list)
  • list_ollama_models: List available models on the local Ollama instance
  • compare_ollama_models: Run the same prompt against multiple Ollama models and return their outputs side-by-side for comparison
  • remember_consult: Store the result of a consult into a local memory store (or configured memory service)
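
A consult_ollama call from an MCP client is an ordinary tools/call request. A minimal sketch, assuming the tool takes model and prompt arguments (check the tool's schema for the exact names; the model name here is just an example):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_ollama",
    "arguments": {
      "model": "llama3",
      "prompt": "Argue against using a message queue for this design."
    }
  }
}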

Installation

  1. Install the server:

    npm i -g https://github.com/Atomic-Germ/mcp-consult/releases/download/v1.0.1/mcp-ollama-consult-1.0.1.tgz
    
  2. Configure the server in your MCP client's configuration file:

    {
      "servers": {
        "ollama-consult": {
          "type": "stdio",
          "command": "mcp-ollama-consult",
          "args": []
        }
      },
      "inputs": []
    }
    

Usage

Make sure Ollama is running locally (default: http://localhost:11434).
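
To verify the endpoint is reachable, you can query Ollama's model-listing API (assuming curl is available):

curl http://localhost:11434/api/tags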

Start the MCP server:

mcp-ollama-consult

Or for development:

npm run dev

Configuration

Set the OLLAMA_BASE_URL environment variable to change the Ollama endpoint:

OLLAMA_BASE_URL=http://your-ollama-server:11434 npm start
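
Internally, this kind of override typically reduces to reading the variable with a fallback. A minimal TypeScript sketch of the approach (an assumption for illustration, not the server's actual code):

// Read the endpoint override, falling back to Ollama's default port.
const baseUrl: string = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";

// Sanity check: list the models available at that endpoint (Node 18+ ships global fetch).
const res = await fetch(`${baseUrl}/api/tags`);
const { models } = await res.json();
console.log(models.map((m: { name: string }) => m.name));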

Docker

To run with Docker, create a Dockerfile along these lines:

FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
# Install production dependencies only (devDependencies are not needed at runtime)
RUN npm ci --omit=dev
# Copy the precompiled output (build the project before building the image)
COPY dist/ ./dist/
CMD ["node", "dist/index.js"]

Requirements

  • Node.js 18+
  • Ollama running locally or accessible via HTTP
