
Databricks MCP Server

Available on PyPI · Python 3.10+ · License: Apache 2.0

A comprehensive Model Context Protocol (MCP) server for Databricks, built on the official Databricks Python SDK.

Provides 263 tools and 8 prompt templates across 28 service domains, giving AI assistants full access to the Databricks platform.

Features

  • SDK-first: Uses databricks-sdk for type safety and automatic API freshness
  • Comprehensive: Covers Unity Catalog, SQL, Compute, Jobs, Pipelines, Serving, Vector Search, Apps, Lakebase, Dashboards, Genie, Secrets, IAM, Connections, Experiments, and Delta Sharing
  • Zero custom auth: Delegates authentication entirely to the SDK (PAT, OAuth, Azure AD, service principal -- all automatic)
  • Selective loading: Include/exclude tool modules via environment variables
  • MCP Resources: Read-only workspace context (URL, current user, auth type)

Quick Start

Installation

pip install databricks-sdk-mcp

Or run it directly without installing, via uvx:

uvx databricks-sdk-mcp

Or run with Docker:

docker run -i -e DATABRICKS_HOST=... -e DATABRICKS_TOKEN=... databricks-mcp

Or install from source:

git clone https://github.com/pramodbhatofficial/databricks-mcp-server.git
cd databricks-mcp-server
pip install -e ".[dev]"

Authentication

Authentication is handled by the Databricks SDK. Set one of:

Personal Access Token (simplest):

export DATABRICKS_HOST=https://your-workspace.databricks.com
export DATABRICKS_TOKEN=dapi...

OAuth (M2M):

export DATABRICKS_HOST=https://your-workspace.databricks.com
export DATABRICKS_CLIENT_ID=...
export DATABRICKS_CLIENT_SECRET=...

Other methods: Azure AD, Databricks CLI profile, Azure Managed Identity -- all auto-detected by the SDK.
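As a rough illustration of how credential resolution works, the sketch below mimics a simplified version of the SDK's behavior. `detect_auth` is a hypothetical helper for illustration only, not an SDK function; the real databricks-sdk consults many more sources (CLI config profiles, Azure AD, cloud metadata endpoints) and in a more nuanced order.

```python
def detect_auth(env):
    """Hypothetical, simplified sketch of credential resolution from env vars.

    The real databricks-sdk checks additional sources (CLI profiles, Azure AD,
    managed identity) before falling back to defaults.
    """
    if env.get("DATABRICKS_TOKEN"):
        return "pat"  # personal access token
    if env.get("DATABRICKS_CLIENT_ID") and env.get("DATABRICKS_CLIENT_SECRET"):
        return "oauth-m2m"  # machine-to-machine OAuth
    return "auto"  # defer to CLI profile, Azure AD, managed identity, ...

detect_auth({"DATABRICKS_HOST": "https://x", "DATABRICKS_TOKEN": "dapi123"})  # → 'pat'
```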

Running

databricks-mcp

This starts the MCP server using stdio transport.

Integrations

Claude Code (Terminal)

Add to ~/.claude/settings.json or your project's .claude/settings.json:

{
  "mcpServers": {
    "databricks": {
      "command": "databricks-mcp",
      "env": {
        "DATABRICKS_HOST": "https://your-workspace.databricks.com",
        "DATABRICKS_TOKEN": "dapi..."
      }
    }
  }
}

Then restart Claude Code. Verify with /mcp to see the registered tools.

Claude Desktop

Add to your Claude Desktop config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "databricks": {
      "command": "databricks-mcp",
      "env": {
        "DATABRICKS_HOST": "https://your-workspace.databricks.com",
        "DATABRICKS_TOKEN": "dapi..."
      }
    }
  }
}

Restart Claude Desktop. The Databricks tools will appear in the tool picker.

Cursor

Add to .cursor/mcp.json in your project root (or ~/.cursor/mcp.json for global):

{
  "mcpServers": {
    "databricks": {
      "command": "databricks-mcp",
      "env": {
        "DATABRICKS_HOST": "https://your-workspace.databricks.com",
        "DATABRICKS_TOKEN": "dapi..."
      }
    }
  }
}

Open Cursor Settings > MCP to verify the server is connected.

Windsurf

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "databricks": {
      "command": "databricks-mcp",
      "env": {
        "DATABRICKS_HOST": "https://your-workspace.databricks.com",
        "DATABRICKS_TOKEN": "dapi..."
      }
    }
  }
}

VS Code (Copilot)

Add to .vscode/mcp.json in your project:

{
  "servers": {
    "databricks": {
      "command": "databricks-mcp",
      "env": {
        "DATABRICKS_HOST": "https://your-workspace.databricks.com",
        "DATABRICKS_TOKEN": "dapi..."
      }
    }
  }
}

Zed

Add to Zed's settings (~/.config/zed/settings.json):

{
  "context_servers": {
    "databricks": {
      "command": {
        "path": "databricks-mcp",
        "env": {
          "DATABRICKS_HOST": "https://your-workspace.databricks.com",
          "DATABRICKS_TOKEN": "dapi..."
        }
      }
    }
  }
}

Any MCP Client (Generic stdio)

The server uses stdio transport. Connect from any MCP-compatible client:

# Set auth env vars
export DATABRICKS_HOST=https://your-workspace.databricks.com
export DATABRICKS_TOKEN=dapi...

# Start the server (communicates via stdin/stdout)
databricks-mcp
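Under stdio transport, client and server exchange newline-delimited JSON-RPC 2.0 messages over stdin/stdout. As a sketch of what goes over the wire (the capability and version fields here are illustrative; consult the MCP specification for the full schema), the first message a client writes is an initialize request:

```python
import json

# A JSON-RPC 2.0 initialize request, the first message an MCP client writes
# to the server's stdin, terminated by a newline. Field values are
# illustrative, not a complete MCP handshake.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}
wire_line = json.dumps(request) + "\n"
```

The server replies on stdout with its own capabilities, after which the client can issue requests such as `tools/list` and `tools/call`.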

Tip: Load Only What You Need

If your MCP client struggles with many tools, use selective loading to reduce the tool count:

{
  "mcpServers": {
    "databricks": {
      "command": "databricks-mcp",
      "env": {
        "DATABRICKS_HOST": "https://your-workspace.databricks.com",
        "DATABRICKS_TOKEN": "dapi...",
        "DATABRICKS_MCP_TOOLS_INCLUDE": "unity_catalog,sql,compute,jobs"
      }
    }
  }
}

Tool Modules

Module | Tools | Description
unity_catalog | 23 | Catalogs, schemas, tables, volumes, functions, registered models
sql | 14 | Warehouses, SQL execution, queries, alerts, history
workspace | 10 | Notebooks, files, repos
compute | 18 | Clusters, instance pools, policies, node types, Spark versions
jobs | 13 | Jobs, runs, tasks, repair, cancel all
pipelines | 8 | DLT / Lakeflow pipelines
serving | 10 | Serving endpoints, model versions, OpenAPI
vector_search | 10 | Vector search endpoints, indexes, sync
apps | 10 | Databricks Apps lifecycle
database | 10 | Lakebase PostgreSQL instances
dashboards | 9 | Lakeview AI/BI dashboards, published views
genie | 5 | Genie AI/BI conversations
secrets | 8 | Secret scopes and secrets
iam | 16 | Users, groups, service principals, permissions, current user
connections | 5 | External connections
experiments | 14 | MLflow experiments, runs, artifacts, metrics, params
sharing | 11 | Delta Sharing shares, recipients, providers
files | 12 | DBFS and UC Volumes file operations
grants | 3 | Unity Catalog permission grants (GRANT/REVOKE)
storage | 10 | Storage credentials and external locations
metastores | 8 | Unity Catalog metastore management
online_tables | 3 | Online tables for low-latency serving
global_init_scripts | 5 | Workspace-wide init scripts
tokens | 5 | Personal access token management
git_credentials | 5 | Git credential management for repos
quality_monitors | 8 | Data quality monitoring and refreshes
command_execution | 4 | Interactive command execution on clusters
workflows | 5 | Composite multi-step operations (workspace status, schema setup, query preview)

Selective Tool Loading

With 263 tools, it's recommended to load only the modules you need. This improves agent performance and tool selection accuracy.

Role-Based Presets (Recommended)

Pick a preset that matches your role:

Preset | Modules | Tools | Config
Data Engineer | unity_catalog, sql, compute, jobs, pipelines, files, quality_monitors | ~100 | DATABRICKS_MCP_TOOLS_INCLUDE=unity_catalog,sql,compute,jobs,pipelines,files,quality_monitors
ML Engineer | serving, vector_search, experiments, compute, unity_catalog, online_tables, files | ~98 | DATABRICKS_MCP_TOOLS_INCLUDE=serving,vector_search,experiments,compute,unity_catalog,online_tables,files
Platform Admin | iam, secrets, tokens, metastores, compute, global_init_scripts, grants, storage | ~85 | DATABRICKS_MCP_TOOLS_INCLUDE=iam,secrets,tokens,metastores,compute,global_init_scripts,grants,storage
App Developer | apps, database, sql, files, serving, secrets | ~64 | DATABRICKS_MCP_TOOLS_INCLUDE=apps,database,sql,files,serving,secrets
Data Analyst | sql, unity_catalog, dashboards, genie, workspace | ~61 | DATABRICKS_MCP_TOOLS_INCLUDE=sql,unity_catalog,dashboards,genie,workspace
Minimal | sql, unity_catalog | ~37 | DATABRICKS_MCP_TOOLS_INCLUDE=sql,unity_catalog

Example using a preset in Claude Code:

{
  "mcpServers": {
    "databricks": {
      "command": "databricks-mcp",
      "env": {
        "DATABRICKS_HOST": "https://your-workspace.databricks.com",
        "DATABRICKS_TOKEN": "dapi...",
        "DATABRICKS_MCP_TOOLS_INCLUDE": "unity_catalog,sql,compute,jobs,pipelines,files,quality_monitors"
      }
    }
  }
}

Custom Filtering

# Only include specific modules
export DATABRICKS_MCP_TOOLS_INCLUDE=unity_catalog,sql,serving

# Exclude specific modules (ignored if INCLUDE is also set)
export DATABRICKS_MCP_TOOLS_EXCLUDE=iam,sharing,experiments

If INCLUDE is set, only those modules load. If EXCLUDE is set, everything except those modules loads. If both are set, INCLUDE takes precedence and EXCLUDE is ignored.
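The precedence rules can be sketched as a small filter function. This is a simplified illustration of the documented behavior, not the server's actual implementation, and `ALL_MODULES` below is a placeholder subset of the real module registry:

```python
# Placeholder subset of the server's module registry, for illustration.
ALL_MODULES = {"unity_catalog", "sql", "compute", "jobs", "iam", "serving"}

def select_modules(env):
    """Apply the documented precedence: INCLUDE wins over EXCLUDE."""
    include = env.get("DATABRICKS_MCP_TOOLS_INCLUDE")
    exclude = env.get("DATABRICKS_MCP_TOOLS_EXCLUDE")
    if include:  # load only the listed modules
        return ALL_MODULES & {m.strip() for m in include.split(",")}
    if exclude:  # load everything except the listed modules
        return ALL_MODULES - {m.strip() for m in exclude.split(",")}
    return set(ALL_MODULES)  # default: load everything

select_modules({"DATABRICKS_MCP_TOOLS_INCLUDE": "sql,serving"})  # → {'sql', 'serving'}
```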

Tool Discovery (For AI Agents)

The server includes built-in tool discovery to help AI agents find the right tools:

MCP Resources

URI | Description
databricks://workspace/info | Workspace URL, current user, auth type
databricks://tools/guide | Tool catalog with module descriptions, use cases, and role presets

Agents can read databricks://tools/guide at connection time to understand what's available.

Discovery Tool

The databricks_tool_guide tool helps agents find the right tools during a conversation:

# Find tools for a specific task
databricks_tool_guide(task="run a SQL query")
databricks_tool_guide(task="deploy an ML model")
databricks_tool_guide(task="create a user")

# Get role-based recommendations
databricks_tool_guide(role="data_engineer")
databricks_tool_guide(role="ml_engineer")

This returns matching modules with descriptions and usage hints, so the agent knows exactly which databricks_* tools to call.
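To illustrate the idea, task-to-module matching could be as simple as keyword lookup. This is a hypothetical sketch, not the server's actual implementation; the keyword lists here are invented for the example:

```python
# Illustrative keyword index: maps a module name to task words that suggest it.
MODULE_KEYWORDS = {
    "sql": ["query", "warehouse", "sql"],
    "serving": ["deploy", "model", "endpoint"],
    "iam": ["user", "group", "permission"],
}

def match_modules(task):
    """Return modules whose keywords appear in the task description."""
    words = task.lower().split()
    return sorted(m for m, kws in MODULE_KEYWORDS.items()
                  if any(kw in words for kw in kws))

match_modules("run a SQL query")  # → ['sql']
```

A real implementation would also rank matches and attach usage hints, but the lookup shape is the same: free-text task in, candidate modules out.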

MCP Prompts (Guided Workflows)

The server includes 8 prompt templates that guide AI agents through multi-step Databricks workflows:

Prompt | Description
explore_data_catalog | Browse Unity Catalog structure (catalogs → schemas → tables)
query_data | Find a warehouse, execute SQL, and format results
debug_failing_job | Investigate a failing job: status, logs, error analysis
setup_ml_experiment | Create an MLflow experiment and configure tracking
deploy_model | Deploy a model to a serving endpoint
setup_data_pipeline | Create a DLT pipeline with scheduling
workspace_health_check | Audit clusters, warehouses, jobs, and endpoints
manage_permissions | Review and update permissions on workspace objects

Prompts appear automatically in MCP clients that support them (e.g., Claude Desktop's prompt picker).

Docker

Run the MCP server in a container:

# Build
docker build -t databricks-mcp .

# Run with stdio
docker run -i \
  -e DATABRICKS_HOST=https://your-workspace.databricks.com \
  -e DATABRICKS_TOKEN=dapi... \
  databricks-mcp

# Run with SSE transport
docker run -p 8080:8080 \
  -e DATABRICKS_HOST=https://your-workspace.databricks.com \
  -e DATABRICKS_TOKEN=dapi... \
  databricks-mcp --transport sse --port 8080

# Run with selective modules
docker run -i \
  -e DATABRICKS_HOST=https://your-workspace.databricks.com \
  -e DATABRICKS_TOKEN=dapi... \
  -e DATABRICKS_MCP_TOOLS_INCLUDE=sql,unity_catalog \
  databricks-mcp

SSE Transport (Remote Server)

The server supports SSE transport for remote connections:

# Start as SSE server
databricks-mcp --transport sse --port 8080

# Custom host/port
databricks-mcp --transport sse --host 127.0.0.1 --port 3000

Connect from any MCP client that supports SSE:

{
  "mcpServers": {
    "databricks": {
      "url": "http://localhost:8080/sse"
    }
  }
}

Development

# Install with dev dependencies
pip install -e ".[dev]"

# Lint
ruff check databricks_mcp/

# Test
pytest tests/ -v

Author

Pramod Bhat

License

Apache 2.0 -- see LICENSE.
