
Langfuse MCP Server

Python 3.10+ · MIT License · Available on PyPI

A comprehensive Model Context Protocol server for Langfuse observability. Provides 18 tools for AI agents to query traces, debug errors, analyze sessions, and manage prompts.

Quick Start

uvx langfuse-mcp --public-key YOUR_KEY --secret-key YOUR_SECRET --host https://cloud.langfuse.com

Or set environment variables and run without flags:

export LANGFUSE_PUBLIC_KEY=pk-...
export LANGFUSE_SECRET_KEY=sk-...
export LANGFUSE_HOST=https://cloud.langfuse.com
uvx langfuse-mcp
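
If you want to verify the connection from code, you can drive the server over stdio with the official MCP Python SDK. This is a minimal sketch, assuming the mcp package is installed and the Langfuse environment variables above are exported; it launches the server with uvx, lists the available tools, and exits.

# Minimal connectivity check using the MCP Python SDK (pip install mcp).
# Assumes LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY and LANGFUSE_HOST are exported.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["langfuse-mcp"],
    env=dict(os.environ),  # pass the Langfuse credentials through to the server
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())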

Why langfuse-mcp?

                       | langfuse-mcp | Official Langfuse MCP
Tools                  | 18           | 2-4
Traces & Observations  | Yes          | No
Sessions & Users       | Yes          | No
Exception Tracking     | Yes          | No
Prompt Management      | Yes          | Yes
Language               | Python       | TypeScript
Selective Tool Loading | Yes          | No

This project provides a full observability toolkit — traces, observations, sessions, exceptions, and prompts — while the official Langfuse MCP focuses on prompt management only.

Available Tools

Traces

Tool         | Description
fetch_traces | Search/filter traces with pagination
fetch_trace  | Fetch a specific trace by ID

Observations

Tool               | Description
fetch_observations | Search/filter observations with pagination
fetch_observation  | Fetch a specific observation by ID

Sessions

Tool                | Description
fetch_sessions      | List recent sessions with pagination
get_session_details | Get detailed session info by ID
get_user_sessions   | Get all sessions for a user

Exceptions

Tool                    | Description
find_exceptions         | Find exceptions grouped by file/function/type
find_exceptions_in_file | Find exceptions in a specific file
get_exception_details   | Get detailed info about a specific exception
get_error_count         | Get total error count

Prompts

Tool                  | Description
get_prompt            | Fetch prompt with resolved dependencies
get_prompt_unresolved | Fetch prompt with dependency tags intact
list_prompts          | List/filter prompts with pagination
create_text_prompt    | Create new text prompt version
create_chat_prompt    | Create new chat prompt version
update_prompt_labels  | Update labels for a prompt version

Schema

Tool            | Description
get_data_schema | Get schema information for response structures
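
From an MCP client, each tool is called by name with a JSON arguments object. The snippet below is only a sketch: it assumes a connected ClientSession (as in the Quick Start sketch above), and the argument names (limit, trace_id) are illustrative guesses rather than the server's documented schema; use get_data_schema or the client's tool listing to confirm the real parameters.

# Illustrative tool calls; argument names such as "limit" and "trace_id" are
# assumptions, not confirmed by the server's schema.
from mcp import ClientSession

async def show_recent_traces(session: ClientSession) -> None:
    # Search recent traces (the pagination argument is hypothetical).
    traces = await session.call_tool("fetch_traces", {"limit": 5})
    print(traces.content)

    # Fetch a single trace by ID (replace with a real trace ID from the listing).
    trace = await session.call_tool("fetch_trace", {"trace_id": "TRACE_ID"})
    print(trace.content)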

Installation

Using uvx (recommended)

uvx langfuse-mcp --help

Using pip

pip install langfuse-mcp
langfuse-mcp --help

Using Docker

docker pull ghcr.io/avivsinai/langfuse-mcp:latest
docker run --rm -i \
  -e LANGFUSE_PUBLIC_KEY=pk-... \
  -e LANGFUSE_SECRET_KEY=sk-... \
  -e LANGFUSE_HOST=https://cloud.langfuse.com \
  ghcr.io/avivsinai/langfuse-mcp:latest

Configuration

Claude Code

Create .mcp.json in your project root:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Codex CLI

Add to ~/.codex/config.toml:

[mcp_servers.langfuse]
command = "uvx"
args = ["langfuse-mcp"]

[mcp_servers.langfuse.env]
LANGFUSE_PUBLIC_KEY = "pk-..."
LANGFUSE_SECRET_KEY = "sk-..."
LANGFUSE_HOST = "https://cloud.langfuse.com"

Or via CLI:

codex mcp add langfuse \
  --env LANGFUSE_PUBLIC_KEY=pk-... \
  --env LANGFUSE_SECRET_KEY=sk-... \
  --env LANGFUSE_HOST=https://cloud.langfuse.com \
  -- uvx langfuse-mcp

Cursor

Create .cursor/mcp.json in your project:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Or use the deeplink for quick setup:

cursor://anysphere.cursor-deeplink/mcp/install?name=langfuse-mcp&config=eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJsYW5nZnVzZS1tY3AiXX0=

Claude Desktop

Add to Claude Desktop settings:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}

Usage

Selective Tool Loading

Load only the tool groups you need to reduce token overhead:

# Load only trace and prompt tools
langfuse-mcp --tools traces,prompts

# Available groups: traces, observations, sessions, exceptions, prompts, schema

Or via environment variable:

export LANGFUSE_MCP_TOOLS=traces,prompts
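
The same flag can be passed through an MCP client config. For example, the Claude Code entry from the Configuration section could load only trace and prompt tools by adding the flag to args:

{
  "mcpServers": {
    "langfuse": {
      "command": "uvx",
      "args": ["langfuse-mcp", "--tools", "traces,prompts"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-...",
        "LANGFUSE_SECRET_KEY": "sk-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}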

Output Modes

Each tool supports different output modes:

Mode             | Description
compact          | Summary with large values truncated (default)
full_json_string | Complete data as JSON string
full_json_file   | Save to file, return summary with path
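
The output mode is chosen per tool call. The argument name below ("output_mode") is a hypothetical placeholder, not confirmed by the project's schema; check get_data_schema for the actual field before relying on it.

# Hypothetical sketch: "output_mode" is an assumed argument name; verify the
# real parameter via get_data_schema.
from mcp import ClientSession

async def fetch_full_trace(session: ClientSession, trace_id: str) -> None:
    result = await session.call_tool(
        "fetch_trace",
        {"trace_id": trace_id, "output_mode": "full_json_string"},
    )
    print(result.content)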

Logging

# Debug logging to console
langfuse-mcp --log-level DEBUG --log-to-console

# Custom log file location
export LANGFUSE_MCP_LOG_FILE=/var/log/langfuse_mcp.log

Default log location: /tmp/langfuse_mcp.log

Development

git clone https://github.com/avivsinai/langfuse-mcp.git
cd langfuse-mcp

# Create virtual environment
uv venv --python 3.11 .venv
source .venv/bin/activate

# Install with dev dependencies
uv pip install -e ".[dev]"

# Run tests
pytest

# Lint and format
ruff check --fix . && ruff format .

Version Management

This project uses Git tags for versioning:

  1. Tag: git tag v1.0.0
  2. Push: git push --tags
  3. GitHub Actions builds and publishes to PyPI

See CHANGELOG.md for release history.

Contributing

Contributions welcome! Please submit a Pull Request.

License

MIT License - see LICENSE for details.
