
astro-airflow-mcp

A comprehensive MCP server for Apache Airflow (versions 2.x and 3.x) that provides AI assistants with tools for DAG management, task execution logs, and system health diagnostics.

Stars: 4 · Forks: 2 · Tools: 28 · Updated: Jan 13, 2026 · Validated: Jan 15, 2026

Airflow MCP Server

CI · Python 3.10+ · PyPI · License: Apache 2.0

A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.

Quickstart

IDEs


Manual configuration

Add to your MCP settings (Cursor: ~/.cursor/mcp.json, VS Code: .vscode/mcp.json):

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

CLI Tools

Claude Code
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
Gemini CLI
gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio
Codex CLI
codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio

Desktop Apps

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

Other MCP Clients

Manual JSON Configuration

Add to your MCP configuration file:

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

Or connect to a running HTTP server: "url": "http://localhost:8000/mcp"
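
For example, a URL-based entry for an already-running server might look like this (assuming the default port of 8000):

```json
{
  "mcpServers": {
    "airflow": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
```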

Note: no installation step is needed; uvx fetches and runs the package directly from PyPI. The --transport stdio flag is required because the server defaults to HTTP mode.

Configuration

By default, the server connects to http://localhost:8080 (Astro CLI default). Set environment variables for custom Airflow instances:

| Variable | Description |
| --- | --- |
| `AIRFLOW_API_URL` | Airflow webserver URL |
| `AIRFLOW_USERNAME` | Username (Airflow 3.x uses OAuth2 token exchange) |
| `AIRFLOW_PASSWORD` | Password |
| `AIRFLOW_AUTH_TOKEN` | Bearer token (alternative to username/password) |

Example with auth (Claude Code):

claude mcp add airflow -e AIRFLOW_API_URL=https://your-airflow.example.com -e AIRFLOW_USERNAME=admin -e AIRFLOW_PASSWORD=admin -- uvx astro-airflow-mcp --transport stdio
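
The same credentials can be supplied through an `env` block in any JSON-based client config (values below are placeholders; `AIRFLOW_AUTH_TOKEN` can replace the username/password pair):

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"],
      "env": {
        "AIRFLOW_API_URL": "https://your-airflow.example.com",
        "AIRFLOW_USERNAME": "admin",
        "AIRFLOW_PASSWORD": "admin"
      }
    }
  }
}
```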

Features

  • Airflow 2.x and 3.x Support: Automatic version detection with adapter pattern
  • MCP Tools for accessing Airflow data:
    • DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
    • Task management (list, get details, get task instances, get logs)
    • Pool management (list, get details)
    • Variable management (list, get specific variables)
    • Connection management (list connections with credentials excluded)
    • Asset/Dataset management (unified naming across versions, data lineage)
    • Plugin and provider information
    • Configuration and version details
  • Consolidated Tools for agent workflows:
    • explore_dag: Get comprehensive DAG information in one call
    • diagnose_dag_run: Debug failed DAG runs with task instance details
    • get_system_health: System overview with health, errors, and warnings
  • MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
  • MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
  • Dual deployment modes:
    • Standalone server: Run as an independent MCP server
    • Airflow plugin: Integrate directly into Airflow 3.x webserver
  • Flexible Authentication:
    • Bearer token (Airflow 2.x and 3.x)
    • Username/password with automatic OAuth2 token exchange (Airflow 3.x)
    • Basic auth (Airflow 2.x)

Available Tools

Consolidated Tools (Agent-Optimized)

| Tool | Description |
| --- | --- |
| `explore_dag` | Get comprehensive DAG info: metadata, tasks, recent runs, source code |
| `diagnose_dag_run` | Debug a DAG run: run details, failed task instances, logs |
| `get_system_health` | System overview: health status, import errors, warnings, DAG stats |

Core Tools

| Tool | Description |
| --- | --- |
| `list_dags` | Get all DAGs and their metadata |
| `get_dag_details` | Get detailed info about a specific DAG |
| `get_dag_source` | Get the source code of a DAG |
| `get_dag_stats` | Get DAG run statistics (Airflow 3.x only) |
| `list_dag_warnings` | Get DAG import warnings |
| `list_import_errors` | Get import errors from DAG files that failed to parse |
| `list_dag_runs` | Get DAG run history |
| `get_dag_run` | Get specific DAG run details |
| `trigger_dag` | Trigger a new DAG run (start a workflow execution) |
| `pause_dag` | Pause a DAG to prevent new scheduled runs |
| `unpause_dag` | Unpause a DAG to resume scheduled runs |
| `list_tasks` | Get all tasks in a DAG |
| `get_task` | Get details about a specific task |
| `get_task_instance` | Get task instance execution details |
| `get_task_logs` | Get logs for a specific task instance execution |
| `list_pools` | Get all resource pools |
| `get_pool` | Get details about a specific pool |
| `list_variables` | Get all Airflow variables |
| `get_variable` | Get a specific variable by key |
| `list_connections` | Get all connections (credentials excluded for security) |
| `list_assets` | Get assets/datasets (unified naming across versions) |
| `list_plugins` | Get installed Airflow plugins |
| `list_providers` | Get installed provider packages |
| `get_airflow_config` | Get Airflow configuration |
| `get_airflow_version` | Get Airflow version information |

MCP Resources

| Resource URI | Description |
| --- | --- |
| `airflow://version` | Airflow version information |
| `airflow://providers` | Installed provider packages |
| `airflow://plugins` | Installed Airflow plugins |
| `airflow://config` | Airflow configuration |

MCP Prompts

| Prompt | Description |
| --- | --- |
| `troubleshoot_failed_dag` | Guided workflow for diagnosing DAG failures |
| `daily_health_check` | Morning health check routine |
| `onboard_new_dag` | Guide for understanding a new DAG |

Advanced Usage

Running as Standalone Server

For HTTP-based integrations or connecting multiple clients to one server:

# Run server (HTTP mode is default)
uvx astro-airflow-mcp --airflow-url https://my-airflow.example.com --username admin --password admin

Connect MCP clients to: http://localhost:8000/mcp

Airflow Plugin Mode

Install into your Airflow 3.x environment to expose MCP at http://your-airflow:8080/mcp/v1:

# Add to your Astro project
echo astro-airflow-mcp >> requirements.txt

CLI Options

| Flag | Environment Variable | Default | Description |
| --- | --- | --- | --- |
| `--transport` | `MCP_TRANSPORT` | `http` | Transport mode (`stdio` or `http`) |
| `--host` | `MCP_HOST` | `localhost` | Host to bind to (HTTP mode only) |
| `--port` | `MCP_PORT` | `8000` | Port to bind to (HTTP mode only) |
| `--airflow-url` | `AIRFLOW_API_URL` | `http://localhost:8080` | Airflow webserver URL |
| `--auth-token` | `AIRFLOW_AUTH_TOKEN` | None | Bearer token for authentication |
| `--username` | `AIRFLOW_USERNAME` | None | Username for authentication (Airflow 3.x uses OAuth2 token exchange) |
| `--password` | `AIRFLOW_PASSWORD` | None | Password for authentication |

Architecture

The server is built using FastMCP with an adapter pattern for Airflow version compatibility:

Core Components

  • Adapters (adapters/): Version-specific API implementations
    • AirflowAdapter (base): Abstract interface for all Airflow API operations
    • AirflowV2Adapter: Airflow 2.x API (/api/v1) with basic auth
    • AirflowV3Adapter: Airflow 3.x API (/api/v2) with OAuth2 token exchange
  • Version Detection: Automatic detection at startup by probing API endpoints
  • Models (models.py): Pydantic models for type-safe API responses

Version Handling Strategy

  1. Major versions (2.x vs 3.x): Adapter pattern with runtime version detection
  2. Minor versions (3.1 vs 3.2): Runtime feature detection with graceful fallbacks
  3. New API parameters: Pass-through **kwargs for forward compatibility
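
The major-version split can be sketched as a minimal adapter pattern. The class names below follow the component list in this README; `select_adapter` and the single method shown are illustrative, not the server's actual interface:

```python
from abc import ABC, abstractmethod


class AirflowAdapter(ABC):
    """Abstract interface over version-specific Airflow REST APIs."""

    @abstractmethod
    def api_base(self) -> str:
        """Return the REST API prefix for this Airflow major version."""


class AirflowV2Adapter(AirflowAdapter):
    def api_base(self) -> str:
        return "/api/v1"  # Airflow 2.x REST API


class AirflowV3Adapter(AirflowAdapter):
    def api_base(self) -> str:
        return "/api/v2"  # Airflow 3.x REST API


def select_adapter(major_version: int) -> AirflowAdapter:
    """Pick an adapter from the major version detected at startup."""
    return AirflowV3Adapter() if major_version >= 3 else AirflowV2Adapter()
```

All tool implementations then call through the abstract interface, so version differences stay confined to the adapter layer.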

Deployment Modes

  • Standalone: Independent ASGI application with HTTP/SSE transport
  • Plugin: Mounted into Airflow 3.x FastAPI webserver

Development

# Setup development environment
make install-dev

# Run tests
make test

# Run all checks
make check

# Local testing with Astro CLI
astro dev start  # Start Airflow
make run         # Run MCP server (connects to localhost:8080)

Contributing

Contributions welcome! Please ensure:

  • All tests pass (make test)
  • Code passes linting (make check)
  • prek hooks pass (make prek)
