

MCP Local Analyst

Demo

Talk to your data locally 💬📊. A private AI Data Analyst built with the Model Context Protocol (MCP), Ollama, and SQLite. Turn natural language into SQL queries without your data ever leaving your machine. Includes a Dockerized Streamlit UI.

📝 Read the full article on Medium
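The core idea above (local model turns a question into SQL, which runs against SQLite) can be sketched in a few lines. This is a hypothetical illustration, not the repo's actual code: the model name, prompt shape, and helper names (`ask_ollama`, `extract_sql`, `run_sql`) are assumptions; only the Ollama endpoint and SQLite usage come from the description.

```python
# Hypothetical sketch of the NL -> SQL loop this project describes.
# The prompt shape and helper names are invented for illustration;
# only Ollama's default endpoint and SQLite are taken from the README.
import json
import re
import sqlite3
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def ask_ollama(question: str, schema: str, model: str = "mistral") -> str:
    """Ask a local Ollama model to translate a question into SQL (needs Ollama running)."""
    prompt = f"Schema:\n{schema}\n\nWrite one SQLite query for: {question}\nSQL:"
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


def extract_sql(text: str) -> str:
    """Strip an optional markdown code fence from the model's reply."""
    match = re.search(r"```(?:sql)?\s*(.*?)```", text, re.DOTALL)
    return (match.group(1) if match else text).strip()


def run_sql(conn: sqlite3.Connection, sql: str) -> list:
    """Execute the generated query and return all rows."""
    return conn.execute(sql).fetchall()


# Typical flow (requires a running Ollama with a pulled model):
#   sql = extract_sql(ask_ollama("total sales per region",
#                                "sales(region TEXT, amount REAL)"))
#   rows = run_sql(conn, sql)
```

The split into small helpers matters here: `extract_sql` and `run_sql` are pure and testable offline, while only `ask_ollama` touches the local model.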

Getting Started

Prerequisites

Before running the application, make sure you have the following installed:

  1. Docker & Docker Compose - Required for running the application in containers

    • Install Docker Desktop from docker.com
    • Includes Docker Compose by default
  2. Ollama - For running local LLM models

    • Download from ollama.ai
    • After installation, pull a model: ollama pull mistral (or your preferred model)
    • Ollama will run as a service on http://localhost:11434
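A quick way to sanity-check the prerequisites above before starting: verify the tools are on your PATH and that Ollama answers on its default port (11434; adjust the URL if you changed it).

```shell
# Prerequisite check -- assumes Ollama's default endpoint.
check() {
  command -v "$1" >/dev/null 2>&1 && echo "$1: found" || echo "$1: MISSING"
}
check docker
check curl
# When Ollama is running, its API lists pulled models at /api/tags:
curl -sf "${OLLAMA_HOST:-http://localhost:11434}/api/tags" >/dev/null \
  && echo "ollama: reachable" || echo "ollama: not reachable"
```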

Installation & Running Locally

  1. Clone the repository

  2. Ensure Ollama is running:

    ollama serve
    

    (Keep this running in a separate terminal)

  3. Start the application with Docker Compose:

    docker-compose up --build
    
  4. Open your browser and navigate to:

    http://localhost:8501
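
Because the app runs in a container while Ollama runs on the host, the compose file has to bridge that gap. The repo ships its own `docker-compose.yml`; the fragment below is only a hypothetical sketch of that shape (service name, port, and environment variable are assumptions), showing the usual trick for reaching host Ollama from inside Docker.

```yaml
# Hypothetical compose shape -- illustrative only, not the repo's file.
services:
  app:
    build: .
    ports:
      - "8501:8501"              # Streamlit UI
    volumes:
      - ./data:/app/data         # SQLite database lives under data/
    environment:
      - OLLAMA_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"   # needed on Linux
```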
    

Configuration

  • Modify the database by editing src/seed_data.py if needed
  • Configure model selection and parameters in the application UI
  • Data is stored in the data/ directory
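Since the database is seeded from src/seed_data.py and stored under data/, such a seed script typically follows the pattern below. The table name, columns, and sample rows here are invented for illustration; the repo's actual schema lives in src/seed_data.py.

```python
# Hypothetical seed script in the spirit of src/seed_data.py -- the real
# schema is defined in the repo; this only shows the common pattern.
import sqlite3
from pathlib import Path


def seed(db_path: str = "data/analyst.db") -> None:
    """Create the SQLite database under data/ and load a few sample rows."""
    Path(db_path).parent.mkdir(parents=True, exist_ok=True)
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS sales (
               id INTEGER PRIMARY KEY,
               region TEXT NOT NULL,
               amount REAL NOT NULL)"""
    )
    conn.executemany(
        "INSERT INTO sales (region, amount) VALUES (?, ?)",
        [("north", 120.5), ("south", 98.0), ("east", 143.2)],
    )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    seed()
```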
