
observe-instrument-mcp

Add OpenTelemetry tracing to Python AI agents. Supports LangGraph, LlamaIndex, CrewAI, OpenAI SDK.

Updated: Mar 13, 2026

Quick Install

uvx observe-instrument-mcp


An MCP server that automatically instruments Python AI agents with the ioa-observe-sdk — adding OpenTelemetry-based tracing, metrics, and logs with zero manual effort.

Works with any MCP-compatible AI coding assistant: Claude Desktop, Cursor, Windsurf, and others.

What it does

Two tools:

instrument_agent — reads a Python agent file, applies full observe SDK instrumentation, writes it back, and returns a summary of changes. Creates a .bak backup before modifying.

check_instrumentation — audits a file for missing instrumentation without modifying it.

Supported frameworks: LlamaIndex, LangGraph, CrewAI, raw OpenAI SDK.
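To illustrate the kind of audit `check_instrumentation` performs, here is a deliberately simplified sketch using only the standard-library `ast` module. It checks for two markers: an import of the observe SDK and decorated entry points. The module names in `OBSERVE_MODULES` and the exact checks are assumptions for illustration; the real tool's analysis is more thorough.

```python
import ast

# Assumed top-level module names for the observe SDK (illustrative only).
OBSERVE_MODULES = {"ioa_observe", "ioa_observe_sdk"}

def audit_source(source: str) -> dict:
    """Return which instrumentation markers are present in the source."""
    tree = ast.parse(source)
    has_import = False
    decorated = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            # Match either `import ioa_observe...` or `from ioa_observe... import ...`
            names = [alias.name for alias in node.names]
            module = getattr(node, "module", None)  # None for plain `import`
            if any(n.split(".")[0] in OBSERVE_MODULES for n in names) or (
                module and module.split(".")[0] in OBSERVE_MODULES
            ):
                has_import = True
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if node.decorator_list:
                decorated.append(node.name)
    return {"sdk_imported": has_import, "decorated": decorated}

example = """
from ioa_observe.sdk import tracer

@tracer
def run_agent(query):
    return query
"""
print(audit_source(example))
# → {'sdk_imported': True, 'decorated': ['run_agent']}
```

An uninstrumented file would report `sdk_imported: False` and an empty `decorated` list, which is the gap `instrument_agent` then fills in.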

Installation

pip install observe-instrument-mcp
# or
uv add observe-instrument-mcp

Requires an API key for your chosen LLM provider. Defaults to Claude (ANTHROPIC_API_KEY). See supported providers below.

Configuration

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "observe-instrument": {
      "command": "uvx",
      "args": ["observe-instrument-mcp"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}

Cursor

Add to .cursor/mcp.json in your project:

{
  "mcpServers": {
    "observe-instrument": {
      "command": "uvx",
      "args": ["observe-instrument-mcp"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}

Windsurf

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "observe-instrument": {
      "command": "uvx",
      "args": ["observe-instrument-mcp"],
      "env": {
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}

Usage

Once configured, ask your AI assistant:

Instrument my agent with the observe SDK: path/to/my_agent.py
Check what observe SDK instrumentation is missing from path/to/my_agent.py

Environment variables

| Variable | Description |
|---|---|
| `LLM_MODEL` | Model to use (default: `claude-sonnet-4-6`). See provider table below. |
| `ANTHROPIC_API_KEY` | Required for Anthropic models |
| `OPENAI_API_KEY` | Required for OpenAI models |
| `GEMINI_API_KEY` | Required for Google Gemini models |
| `GROQ_API_KEY` | Required for Groq models |

Supported providers

| Provider | Key variable | `LLM_MODEL` example |
|---|---|---|
| Anthropic | `ANTHROPIC_API_KEY` | `claude-sonnet-4-6` |
| OpenAI | `OPENAI_API_KEY` | `gpt-4o` |
| Google Gemini | `GEMINI_API_KEY` | `gemini/gemini-2.0-flash` |
| Groq | `GROQ_API_KEY` | `groq/llama-3.3-70b` |
| Ollama (local, free) | none | `ollama/llama3.2` |

After instrumentation

Install the SDK in your project:

pip install ioa-observe-sdk
# or
uv add ioa-observe-sdk

Start the observability stack (OTel Collector + ClickHouse):

cd path/to/observe/deploy
docker compose up -d

Run your agent:

OPENAI_API_KEY=sk-... OTLP_HTTP_ENDPOINT=http://localhost:4318 python my_agent.py

Query traces:

docker exec -it clickhouse-server clickhouse-client --user admin --password admin
SELECT SpanName, ServiceName, Duration / 1000000. AS ms, Timestamp
FROM otel_traces
ORDER BY Timestamp DESC
LIMIT 20;

Development

git clone https://github.com/alanzha2/observe-instrument-mcp
cd observe-instrument-mcp
pip install -e .

# Test the server locally
mcp dev observe_instrument_mcp/server.py

License

Apache-2.0
