Queue Pilot


MCP server for message queue development — combines message inspection with JSON Schema validation. Supports RabbitMQ and Kafka.

Designed for integration projects where multiple teams communicate via message brokers: inspect queues/topics, view messages, and validate payloads against agreed-upon schemas — all from your AI assistant.

Queue Pilot architecture: MCP clients connect to Queue Pilot, which interfaces with RabbitMQ and Kafka

Features

  • Multi-broker support — RabbitMQ and Apache Kafka via a unified adapter interface
  • Message Inspection — Browse queues/topics, peek at messages without consuming them
  • Schema Validation — Validate message payloads against JSON Schema definitions
  • Combined Inspection — inspect_queue peeks messages AND validates each against its schema
  • Validated Publishing — publish_message validates against a schema before sending — invalid messages never hit the broker
  • Queue Management — Create queues/topics, bindings, and purge messages for dev/test workflows
  • Broker Info — List exchanges, bindings, consumer groups, and partition details

Prerequisites

  • Node.js >= 22 — Required runtime (check with node --version)
  • A message broker:
    • RabbitMQ with the management plugin enabled (HTTP API on port 15672), or
    • Apache Kafka (requires @confluentinc/kafka-javascript as peer dependency)
  • An MCP-compatible client — Claude Code, Claude Desktop, Cursor, VS Code (Copilot), Windsurf, etc.

Quick Start

1. Define your schemas

Create JSON Schema files in a directory:

schemas/order.created.json:

{
  "$id": "order.created",
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "Order Created",
  "description": "Emitted when a new order is placed",
  "version": "1.0.0",
  "type": "object",
  "required": ["orderId", "amount"],
  "properties": {
    "orderId": { "type": "string" },
    "amount": { "type": "number" }
  }
}
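
A message that conforms to this schema might look like the following (field values are illustrative; the type property is what Queue Pilot later uses to match a message to its schema by $id):

```json
{
  "type": "order.created",
  "orderId": "A-1001",
  "amount": 49.95
}
```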

2. Add to your MCP client

Generate the config for your client with queue-pilot init:

npx queue-pilot init --schemas /absolute/path/to/your/schemas --client <name>

Supported clients: claude-code, claude-desktop, vscode, cursor, windsurf. Omit --client for generic JSON.

For Kafka, add --broker kafka. The generated config automatically includes the required @confluentinc/kafka-javascript peer dependency.

Non-default credentials are included as environment variables to avoid exposing secrets in ps output:

npx queue-pilot init --schemas ./schemas --rabbitmq-user admin --rabbitmq-pass secret

Run npx queue-pilot init --help for all options including Kafka SASL authentication.

Windows note: If npx fails to resolve the package, try cmd /c npx queue-pilot init ....

Manual configuration (without init)

Add the following server configuration to your MCP client:

RabbitMQ:

{
  "mcpServers": {
    "queue-pilot": {
      "command": "npx",
      "args": [
        "-y",
        "queue-pilot",
        "--schemas", "/absolute/path/to/your/schemas"
      ]
    }
  }
}

Kafka:

{
  "mcpServers": {
    "queue-pilot": {
      "command": "npx",
      "args": [
        "-y",
        "--package=@confluentinc/kafka-javascript",
        "--package=queue-pilot",
        "queue-pilot",
        "--schemas", "/absolute/path/to/your/schemas",
        "--broker", "kafka"
      ],
      "env": {
        "KAFKA_BROKERS": "localhost:9092"
      }
    }
  }
}

Schema path tip: Use an absolute path for --schemas. Relative paths resolve from the MCP client's working directory, which may not be your project root.

| Client | Config file |
| --- | --- |
| Claude Code | .mcp.json (project) or ~/.claude.json (user) |
| Claude Desktop | claude_desktop_config.json |
| Cursor | .cursor/mcp.json |
| VS Code (Copilot) | .vscode/mcp.json (uses "servers" instead of "mcpServers") |
| Windsurf | ~/.codeium/windsurf/mcp_config.json |

Development (running from source)
{
  "mcpServers": {
    "queue-pilot": {
      "command": "npx",
      "args": [
        "tsx",
        "src/index.ts",
        "--schemas", "./schemas"
      ],
      "cwd": "/path/to/queue-pilot"
    }
  }
}

3. Use it

Ask your assistant things like:

  • "Which queues are there and how many messages do they have?"
  • "Show me the messages in the orders queue"
  • "Inspect the registration queue and check if all messages are valid"
  • "What schemas are available?"
  • "Validate this message against the order.created schema"
  • "Publish an order.created event to the events exchange"
  • "Create a queue called dead-letters and bind it to the events exchange"
  • "Purge all messages from the orders queue"
  • "List all consumer groups" (Kafka)
  • "Show me the partition details for the orders topic" (Kafka)
MCP Tools

Universal tools (all brokers)

| Tool | Description |
| --- | --- |
| list_schemas | List all loaded message schemas |
| get_schema | Get the full definition of a specific schema |
| validate_message | Validate a JSON message against a schema |
| list_queues | List all queues/topics with message counts |
| peek_messages | View messages in a queue/topic without consuming them |
| inspect_queue | Peek messages + validate each against its schema |
| get_overview | Get broker cluster overview |
| check_health | Check broker health status |
| get_queue | Get detailed information about a specific queue/topic |
| list_consumers | List consumers (RabbitMQ) or consumer groups (Kafka) |
| publish_message | Publish a message with optional schema validation gate |
| purge_queue | Remove all messages from a queue/topic |
| create_queue | Create a new queue/topic |
| delete_queue | Delete a queue/topic |
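
Under the hood, an MCP client invokes these tools via JSON-RPC tools/call requests. A request to inspect_queue might look roughly like this (the exact argument names are an assumption — your client discovers the real input schema via tools/list):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "inspect_queue",
    "arguments": { "queue": "orders" }
  }
}
```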

RabbitMQ-specific tools

| Tool | Description |
| --- | --- |
| list_exchanges | List all RabbitMQ exchanges |
| create_exchange | Create a new exchange |
| delete_exchange | Delete an exchange |
| list_bindings | List bindings between exchanges and queues |
| create_binding | Bind a queue to an exchange with a routing key |
| delete_binding | Delete a binding |
| list_connections | List all client connections to the broker |

Kafka-specific tools

| Tool | Description |
| --- | --- |
| list_consumer_groups | List all consumer groups with their state |
| describe_consumer_group | Show members, assignments, and state of a consumer group |
| list_partitions | Show partition details for a topic (leader, replicas, ISR) |
| get_offsets | Show earliest/latest offsets per partition |

MCP Prompts & Resources

Prompts

Pre-built workflow templates that guide your AI assistant through multi-step operations.

| Prompt | Parameters | Description |
| --- | --- | --- |
| debug-flow | exchange, queue | Trace bindings from exchange to queue, peek messages, and validate each against its schema |
| health-report | (none) | Check broker health, get cluster overview, flag queues with backed-up messages |
| schema-compliance | queue (optional) | Peek messages and validate each against its schema — for one queue or all queues |

Usage example (in any MCP-compatible client):

"Use the debug-flow prompt for exchange 'events' and queue 'orders'"

Resources

Each loaded schema is exposed as a readable MCP resource at schema:///<schema-name>.

Clients that support MCP resources can read schema definitions directly without calling tools. For example, a schema loaded from order.created.json is available at schema:///order.created.

Schema Format

Schemas follow JSON Schema draft-07 with a few conventions:

  • $id — Message type identifier (matches the type property on messages)
  • version — Schema version (custom field, not validated by JSON Schema)
  • Standard JSON Schema validation including required, properties, format, etc.

Schema matching: when inspecting a queue, the message's type property is used to find the corresponding schema by $id.
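
This matching convention can be sketched as follows. This is a minimal illustration only — the names below are hypothetical, and only the required and primitive type keywords are checked here, whereas the real server supports full JSON Schema draft-07 validation:

```typescript
// Sketch of the matching convention: a message's `type` property selects
// the schema whose `$id` matches, then the message is validated against it.

type Schema = {
  $id: string;
  required?: string[];
  properties?: Record<string, { type?: string }>;
};

// Find the schema whose $id equals the message's `type` property.
function findSchema(
  schemas: Schema[],
  message: Record<string, unknown>,
): Schema | undefined {
  return schemas.find((s) => s.$id === String(message["type"]));
}

// Check `required` fields and primitive `type` keywords only.
function validateMessage(
  schema: Schema,
  message: Record<string, unknown>,
): string[] {
  const errors: string[] = [];
  for (const field of schema.required ?? []) {
    if (!(field in message)) errors.push(`missing required field: ${field}`);
  }
  for (const [name, def] of Object.entries(schema.properties ?? {})) {
    if (name in message && def.type !== undefined && typeof message[name] !== def.type) {
      errors.push(`field ${name} should be ${def.type}`);
    }
  }
  return errors;
}

const orderCreated: Schema = {
  $id: "order.created",
  required: ["orderId", "amount"],
  properties: { orderId: { type: "string" }, amount: { type: "number" } },
};

const msg = { type: "order.created", orderId: "A-1001", amount: 49.95 };
const schema = findSchema([orderCreated], msg);
console.log(schema ? validateMessage(schema, msg) : "no matching schema"); // []
```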

Configuration

CLI arguments take priority over environment variables, which take priority over defaults.

| Setting | CLI flag | Env var | Default |
| --- | --- | --- | --- |
| Schema directory | --schemas | (none) | (required) |
| Broker type | --broker | (none) | rabbitmq |
| RabbitMQ URL | --rabbitmq-url | RABBITMQ_URL | http://localhost:15672 |
| RabbitMQ user | --rabbitmq-user | RABBITMQ_USER | guest |
| RabbitMQ password | --rabbitmq-pass | RABBITMQ_PASS | guest |
| Kafka brokers | --kafka-brokers | KAFKA_BROKERS | localhost:9092 |
| Kafka client ID | --kafka-client-id | KAFKA_CLIENT_ID | queue-pilot |
| SASL mechanism | --kafka-sasl-mechanism | KAFKA_SASL_MECHANISM | (none) |
| SASL username | --kafka-sasl-username | KAFKA_SASL_USERNAME | (none) |
| SASL password | --kafka-sasl-password | KAFKA_SASL_PASSWORD | (none) |
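
The precedence rule above can be sketched as a simple fallback chain (a hedged illustration — the function name is hypothetical, not part of Queue Pilot's API):

```typescript
// CLI argument wins over environment variable, which wins over the default.
function resolveSetting(
  cliValue: string | undefined,
  envValue: string | undefined,
  defaultValue: string,
): string {
  return cliValue ?? envValue ?? defaultValue;
}

// e.g. no --rabbitmq-url flag passed, so the env var (or default) applies:
const url = resolveSetting(undefined, process.env.RABBITMQ_URL, "http://localhost:15672");
console.log(url);
```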

Use environment variables in MCP client env blocks to avoid exposing credentials in ps output.

Development

npm install
npm test                    # Unit tests
npm run test:coverage       # Coverage report
npm run build               # TypeScript compilation
npm run typecheck           # Type check

# Integration tests (requires RabbitMQ)
docker compose up -d --wait
npm run test:integration

Tech Stack

License

MIT
