
fastbcp-mcp

MCP server for FastBCP — high-performance parallel database export to files and cloud

Updated: Feb 23, 2026

Quick Install

uvx fastbcp-mcp

FastBCP MCP Server

A Model Context Protocol (MCP) server that exposes FastBCP functionality for exporting data from databases to files (CSV, TSV, JSON, BSON, Parquet, XLSX, Binary) with optional cloud storage targets.

Overview

FastBCP is a high-performance CLI tool for exporting data from databases to files. This MCP server wraps FastBCP functionality and provides:

  • Safety-first approach: Preview commands before execution with user confirmation required
  • Password masking: Credentials and connection strings are never displayed in logs or output
  • Intelligent validation: Parameter validation with database-specific compatibility checks
  • Smart suggestions: Automatic parallelism method recommendations
  • Version detection: Automatic binary version detection with capability registry
  • Comprehensive logging: Full execution logs with timestamps and results

MCP Tools

1. preview_export_command

Build and preview a FastBCP export command WITHOUT executing it. Shows the exact command with passwords masked. Always use this first.

2. execute_export

Execute a previously previewed command. Requires confirmation: true as a safety mechanism.

3. validate_connection

Validate source database connection parameters (parameter check only, does not test actual connectivity).

4. list_supported_formats

List all supported source databases, output formats, and storage targets.

5. suggest_parallelism_method

Recommend the optimal parallelism method based on source database type and table characteristics.

6. get_version

Report the detected FastBCP binary version, supported types, and feature flags.
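As a rough illustration, the intended preview-then-execute sequence might look like the following client-side payloads. Only the tool names and the `confirmation` flag come from this document; the argument keys (`source_type`, `table`, and so on) are illustrative assumptions, not the server's exact schema.

```python
# Hypothetical MCP tool-call payloads for the two-step flow described above.
preview_call = {
    "tool": "preview_export_command",
    "arguments": {
        "source_type": "pgsql",          # assumed key/value; check the real schema
        "table": "orders",
        "format": "csv",
        "file_output": "/tmp/orders.csv",
    },
}

# execute_export refuses to run unless confirmation is explicitly True.
execute_call = {
    "tool": "execute_export",
    "arguments": {"confirmation": True},
}
```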

Installation

Prerequisites

  • Python 3.10 or higher
  • FastBCP binary v0.29+ (obtain from Arpe.io)
  • Claude Code or another MCP client

Setup

  1. Clone or download this repository:

    cd /path/to/fastbcp-mcp
    
  2. Install Python dependencies:

    pip install -r requirements.txt
    
  3. Configure environment:

    cp .env.example .env
    # Edit .env with your FastBCP path
    
  4. Add to Claude Code configuration (~/.claude.json):

    {
      "mcpServers": {
        "fastbcp": {
          "type": "stdio",
          "command": "python",
          "args": ["/absolute/path/to/fastbcp-mcp/src/server.py"],
          "env": {
            "FASTBCP_PATH": "/absolute/path/to/FastBCP"
          }
        }
      }
    }
    
  5. Restart Claude Code to load the MCP server.

  6. Verify installation:

    # In Claude Code, run:
    /mcp
    # You should see "fastbcp: connected"
    

Configuration

Environment Variables

Edit .env to configure:

# Path to FastBCP binary (required)
FASTBCP_PATH=./fastbcp/FastBCP

# Execution timeout in seconds (default: 1800 = 30 minutes)
FASTBCP_TIMEOUT=1800

# Log directory (default: ./logs)
FASTBCP_LOG_DIR=./logs

# Log level (default: INFO)
LOG_LEVEL=INFO
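For reference, a minimal sketch of how a server process might read these variables, with the documented defaults; the variable names come from the list above, but this is not the server's actual startup code.

```python
import os

# FASTBCP_PATH is documented as required; the fallback here mirrors .env.example.
FASTBCP_PATH = os.environ.get("FASTBCP_PATH", "./fastbcp/FastBCP")

# Execution timeout in seconds (documented default: 1800 = 30 minutes).
FASTBCP_TIMEOUT = int(os.environ.get("FASTBCP_TIMEOUT", "1800"))

# Log directory and level, with the documented defaults.
FASTBCP_LOG_DIR = os.environ.get("FASTBCP_LOG_DIR", "./logs")
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")
```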

Connection Options

The server supports multiple ways to authenticate and connect:

| Parameter | Description |
|---|---|
| `server` | Host:port or host\instance (optional with `connect_string` or `dsn`) |
| `user` / `password` | Standard credentials |
| `trusted_auth` | Windows trusted authentication |
| `connect_string` | Full connection string (excludes server/user/password/dsn) |
| `dsn` | ODBC DSN name (excludes server/provider) |
| `provider` | OleDB provider name |
| `application_intent` | SQL Server application intent (ReadOnly/ReadWrite) |

Output Options

| Option | CLI Flag | Description |
|---|---|---|
| `format` | `--format` | Output format: csv, tsv, json, bson, parquet, xlsx, binary |
| `file_output` | `--fileoutput` | Output file path |
| `directory` | `--directory` | Output directory path |
| `storage_target` | `--storagetarget` | Storage: local, s3, s3compatible, azure_blob, azure_datalake, fabric_onelake |
| `delimiter` | `--delimiter` | Field delimiter (CSV/TSV) |
| `quotes` | `--quotes` | Quote character |
| `encoding` | `--encoding` | Output encoding |
| `no_header` | `--noheader` | Omit header row (CSV/TSV) |
| `decimal_separator` | `--decimalseparator` | Decimal separator (. or ,) |
| `date_format` | `--dateformat` | Date format string |
| `bool_format` | `--boolformat` | Boolean format: TrueFalse, OneZero, YesNo |
| `parquet_compression` | `--parquetcompression` | Parquet compression: None, Snappy, Gzip, Lz4, Lzo, Zstd |
| `timestamped` | `--timestamped` | Add timestamp to output filename |
| `merge` | `--merge` | Merge parallel output files |
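To make the option-to-flag mapping concrete, here is a hedged sketch of how these options could be turned into command-line arguments. The flag spellings come from the table above; the function itself (and the convention that `True` emits a bare switch) is illustrative, not the server's actual implementation, and only a subset of options is shown.

```python
# Subset of the documented option -> flag mapping (assumed data structure).
FLAG_MAP = {
    "format": "--format",
    "file_output": "--fileoutput",
    "directory": "--directory",
    "storage_target": "--storagetarget",
    "delimiter": "--delimiter",
    "no_header": "--noheader",
    "parquet_compression": "--parquetcompression",
    "timestamped": "--timestamped",
    "merge": "--merge",
}

def build_output_flags(options: dict) -> list[str]:
    """Map option names to CLI flags: True emits a bare switch (e.g. --noheader),
    any other non-empty value emits a flag/value pair."""
    flags: list[str] = []
    for key, value in options.items():
        flag = FLAG_MAP[key]
        if value is True:
            flags.append(flag)
        elif value is not None and value is not False:
            flags.extend([flag, str(value)])
    return flags
```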

Export Options

| Option | CLI Flag | Description |
|---|---|---|
| `method` | `--method` | Parallelism method |
| `distribute_key_column` | `--distributeKeyColumn` | Column for data distribution |
| `degree` | `--degree` | Parallelism degree (default: 1) |
| `load_mode` | `--loadmode` | Append or Truncate |
| `batch_size` | `--batchsize` | Batch size for export operations |
| `map_method` | `--mapmethod` | Column mapping: Position or Name |
| `run_id` | `--runid` | Run ID for logging |
| `data_driven_query` | `--datadrivenquery` | Custom SQL for DataDriven method |
| `settings_file` | `--settingsfile` | Custom settings JSON file |
| `log_level` | `--loglevel` | Override log level (Information/Debug) |
| `no_banner` | `--nobanner` | Suppress banner output |
| `license_path` | `--license` | License file path or URL |
| `cloud_profile` | `--cloudprofile` | Cloud storage profile name |

Usage Examples

PostgreSQL to CSV Export

User: "Export the 'orders' table from PostgreSQL (localhost:5432, database: sales_db,
       schema: public) to CSV file at /tmp/orders.csv. Use parallel export."

Claude Code will:
1. Call suggest_parallelism_method to recommend Ctid for PostgreSQL
2. Call preview_export_command with your parameters
3. Show the command with masked passwords
4. Explain what will happen
5. Ask for confirmation
6. Execute with execute_export when you approve

Export to Parquet with Compression

User: "Export the 'transactions' table from SQL Server to Parquet format
       with Snappy compression, saved to /data/exports/."

Claude Code will use parquet format with parquet_compression set to Snappy.

Export to S3

User: "Export the 'users' table from PostgreSQL to CSV on S3 bucket
       s3://my-bucket/exports/ using my AWS profile."

Claude Code will use storage_target=s3 with cloud_profile.

Check Version and Capabilities

User: "What version of FastBCP is installed?"

Claude Code will call get_version and display the detected version,
supported source types, output formats, and available features.

Two-Step Safety Process

This server implements a mandatory two-step process:

  1. Preview - Always use preview_export_command first
  2. Execute - Use execute_export with confirmation: true

You cannot execute without previewing first and confirming.
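The gate can be sketched as follows, assuming the server tracks previewed commands in memory. The function names mirror the tools above, but the bodies, return strings, and the `PREVIEWED` registry are illustrative assumptions.

```python
# Minimal sketch of the two-step safety gate: execution is refused unless the
# command was previously previewed AND confirmation is explicitly True.
PREVIEWED: set[str] = set()  # hypothetical in-memory registry of previewed commands

def preview_export_command(command_id: str) -> str:
    """Register a command as previewed (real server also renders it, masked)."""
    PREVIEWED.add(command_id)
    return f"previewed {command_id} (passwords masked)"

def execute_export(command_id: str, confirmation: bool = False) -> str:
    """Refuse unless previewed first and confirmation=True."""
    if command_id not in PREVIEWED:
        return "refused: call preview_export_command first"
    if not confirmation:
        return "refused: pass confirmation=true after reviewing the preview"
    return f"executing {command_id}"
```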

Security

  • Passwords and connection strings are masked in all output and logs
  • Sensitive flags masked: --sourcepassword, --sourceconnectstring, -x, -g
  • Use environment variables for sensitive configuration
  • Review commands carefully before executing
  • Use minimum required database permissions
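A hedged sketch of the flag masking described above: the sensitive flag names come from the list, while the masking routine itself is illustrative (the real server may also mask values embedded elsewhere, e.g. inside connection strings).

```python
# Flags documented as sensitive; the value following each is masked.
SENSITIVE_FLAGS = {"--sourcepassword", "--sourceconnectstring", "-x", "-g"}

def mask_command(cmd: list[str]) -> list[str]:
    """Return a copy of the argv list with each value that follows a
    sensitive flag replaced by asterisks, leaving other tokens intact."""
    masked = list(cmd)
    for i, token in enumerate(masked[:-1]):
        if token in SENSITIVE_FLAGS:
            masked[i + 1] = "********"
    return masked
```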

Testing

Run the test suite:

# Run all tests
python -m pytest tests/ -v

# Run with coverage
python -m pytest tests/ --cov=src --cov-report=html

Project Structure

fastbcp-mcp/
  src/
    __init__.py
    server.py          # MCP server (tool definitions, handlers)
    fastbcp.py         # Command builder, executor, suggestions
    validators.py      # Pydantic models, enums, validation
    version.py         # Version detection and capabilities registry
  tests/
    __init__.py
    test_command_builder.py
    test_validators.py
    test_version.py
  .env.example
  requirements.txt
  CHANGELOG.md
  README.md

License

This MCP server wrapper is provided as-is. FastBCP itself is a separate product from Arpe.io.
