WakaTime MCP Server

A Model Context Protocol server that enables AI models to query WakaTime coding analytics, providing high-signal data on developer productivity, project timelines, and activity summaries.

Tools: 5 · Updated: Jan 8, 2026 · Validated: Jan 9, 2026

The server runs in two modes:

  • Direct mode: FastMCP serves Streamable HTTP at http://localhost:8000/mcp
  • Proxy mode: mcp-proxy exposes the server over SSE/HTTP and Caddy adds token auth (recommended for self-hosting)

Tooling / API

| Tool | Purpose | Key arguments |
|------|---------|---------------|
| `get_coding_stats` | Detailed stats for a period | `range` (`last_7_days`, `last_30_days`, `last_6_months`, `last_year`, `all_time`) |
| `get_summary` | Activity breakdown for a date/range | `start_date`, `end_date`, `project` |
| `get_all_time` | Total coding time since account creation | `project` (optional) |
| `get_status_bar` | Current-day status (like the editor status bar) | (none) |
| `list_projects` | List/search tracked projects | `query` (optional) |
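
Each tool is invoked with a standard MCP `tools/call` request over JSON-RPC 2.0. As a minimal sketch (the `tools_call` helper is illustrative, not part of this repo; the tool name and arguments come from the table above):

```python
import json

def tools_call(name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 payload."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Ask for a weekly breakdown of coding activity
payload = tools_call("get_coding_stats", {"range": "last_7_days"})
```

In practice an MCP client library builds this payload for you; the sketch only shows what goes over the wire.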

Configuration

Configure via environment variables (or a .env file for the self-hosted scripts).

| Variable | Description | Required |
|----------|-------------|----------|
| `WAKATIME_API_KEY` | Your API key from https://wakatime.com/settings/api-key | Yes |
| `MCP_AUTH_KEY` | Token for the auth proxy (proxy/self-hosted mode) | Proxy mode only |
| `PORT` | Direct-mode port (default: 8000) | No |
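
The variables above translate into a straightforward startup check. A sketch of how such a loader could look (`load_config` is a hypothetical helper, not code from `src/server.py`):

```python
import os

def load_config(env=os.environ) -> dict:
    """Read the server's settings from environment variables.

    WAKATIME_API_KEY is mandatory; PORT falls back to 8000 as in direct mode.
    MCP_AUTH_KEY is only needed when running behind the auth proxy.
    """
    api_key = env.get("WAKATIME_API_KEY")
    if not api_key:
        raise RuntimeError(
            "WAKATIME_API_KEY is required "
            "(see https://wakatime.com/settings/api-key)"
        )
    return {
        "wakatime_api_key": api_key,
        "mcp_auth_key": env.get("MCP_AUTH_KEY"),  # None is fine in direct mode
        "port": int(env.get("PORT", "8000")),
    }

config = load_config({"WAKATIME_API_KEY": "demo-key"})
```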

Development (Direct mode)

  1. Install (uv required)

    git clone https://github.com/dpshade/wakatime-mcp.git
    cd wakatime-mcp
    
    uv sync --no-install-project
    
  2. Run

    WAKATIME_API_KEY="your_wakatime_api_key_here" uv run -- python src/server.py
    
  3. Connect

    • URL: http://localhost:8000/mcp
    • Auth: none
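
Over Streamable HTTP, a client opens the session by POSTing an `initialize` request to that URL before any tool calls. A sketch of the handshake payload (the protocol version and client name are illustrative values, not pinned by this server):

```python
import json

SERVER_URL = "http://localhost:8000/mcp"  # direct-mode endpoint from step 3

initialize = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # illustrative; negotiated with the server
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}
body = json.dumps(initialize)
```

An MCP client library (or the MCP Inspector below) handles this exchange automatically.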

Deployment

Option 1: Self-hosted with auth (recommended)

This mode runs the MCP server with FastMCP’s default stdio transport and uses mcp-proxy to expose it over HTTP:

Internet -> (optional Tailscale Funnel) -> Caddy (auth) -> mcp-proxy -> FastMCP (stdio)
                                           :8770           :8767
  1. Configure

    cp .env.example .env
    # Edit .env: set WAKATIME_API_KEY and a strong MCP_AUTH_KEY
    
  2. Download Caddy (auth proxy)

    curl -L https://github.com/caddyserver/caddy/releases/latest/download/caddy_linux_amd64 -o deploy/caddy
    chmod +x deploy/caddy
    
  3. Start

    ./deploy/start.sh
    
  4. Endpoints

    • Auth proxy (recommended):
      • SSE: http://localhost:8770/sse
      • Streamable HTTP: http://localhost:8770/mcp
    • Internal (no auth; do not expose publicly):
      • SSE: http://localhost:8767/sse
      • Streamable HTTP: http://localhost:8767/mcp

mcp-proxy also exposes a health endpoint at http://localhost:8767/status (and via auth proxy at http://localhost:8770/status).
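
Hitting `/status` through Caddy with the token attached is a quick end-to-end check of the auth chain. A stdlib-only sketch (`build_status_request` is an illustrative helper; the token value is a placeholder):

```python
import urllib.request

def build_status_request(base_url: str, auth_key: str) -> urllib.request.Request:
    """Prepare an authenticated GET against the proxy's health endpoint."""
    return urllib.request.Request(
        f"{base_url}/status",
        headers={"Authorization": f"Bearer {auth_key}"},
    )

req = build_status_request("http://localhost:8770", "example-token")
# urllib.request.urlopen(req) would return the health response once the
# stack from ./deploy/start.sh is running.
```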

Systemd (persistent)

./deploy/install-systemd.sh
sudo systemctl enable --now mcp-wakatime mcp-wakatime-auth

Optional: Tailscale Funnel

If you use Tailscale, you can publish the auth proxy port:

tailscale funnel --bg --set-path=/wakatime localhost:8770
tailscale funnel --bg 443 on

Option 2: Docker

Runs mcp-proxy + the server in a container.

cd deploy
docker-compose up -d

  • Endpoint (no auth): http://localhost:8767/sse
  • If you want auth, run Caddy on the host (or add it to your own compose stack) and proxy to 8767.

Option 3: Render

This repo includes render.yaml for deploying the direct Python server.

  • Set environment variable: WAKATIME_API_KEY
  • Your service endpoint will be: https://<your-service>/mcp

Client setup

MCP Inspector

npx @modelcontextprotocol/inspector

Then connect using:

  • Direct mode: http://localhost:8000/mcp (Streamable HTTP)
  • Proxy mode: http://localhost:8770/sse (SSE)

Poke / other hosted clients (proxy mode)

Use the auth proxy SSE endpoint and send MCP_AUTH_KEY via one of:

  • Authorization: Bearer <MCP_AUTH_KEY>
  • X-API-Key: <MCP_AUTH_KEY>
  • Api-Key: <MCP_AUTH_KEY>
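
In code, any one of the three header shapes passes the auth proxy. A sketch (`auth_headers` is an illustrative helper; the token value is a placeholder for your real MCP_AUTH_KEY):

```python
def auth_headers(key: str, style: str = "bearer") -> dict:
    """Return one of the three header shapes the auth proxy accepts."""
    styles = {
        "bearer": {"Authorization": f"Bearer {key}"},
        "x-api-key": {"X-API-Key": key},
        "api-key": {"Api-Key": key},
    }
    return styles[style]

headers = auth_headers("example-token", "x-api-key")
```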

Security notes

  • Generate a strong auth key:
    openssl rand -hex 32
    
  • Never expose the unauthenticated mcp-proxy port (8767) to the public internet.
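
If openssl is not available, Python's secrets module generates a key of the same shape (32 random bytes, 64 hex characters):

```python
import secrets

# Equivalent of `openssl rand -hex 32`: a 64-character hex token
auth_key = secrets.token_hex(32)
print(auth_key)
```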

License

MIT
