lc2mcp


Convert LangChain tools to FastMCP tools — in one line of code.

Stop rewriting your tools. Just adapt them.

lc2mcp is a lightweight adapter that converts existing LangChain tools into FastMCP tools, enabling you to quickly build MCP servers accessible to Claude, Cursor, and any MCP-compatible client.


✨ Features

| Feature | Description |
| --- | --- |
| 🔄 Instant Conversion | One function call converts any LangChain tool to a FastMCP tool |
| 📦 Ecosystem Access | Unlock 1000+ LangChain community tools (search, Wikipedia, SQL, APIs, ...) |
| 🎯 Zero Boilerplate | Automatic Pydantic → JSON Schema conversion |
| 🔐 Context Injection | Pass auth, user info, and request context to tools |
| 📊 Progress & Logging | Full support for MCP progress notifications and logging |
| 🏷️ Namespace Support | Prefix tool names and handle conflicts automatically |
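To make the "Pydantic → JSON Schema" row concrete, here is the general idea: a tool's parameter schema can be derived from its Python type hints. The standalone sketch below is my own illustration using only the standard library, not lc2mcp internals (sketch_schema and TYPE_MAP are invented names); it shows the kind of schema that gets generated for you automatically.

```python
import inspect
import typing

# Illustrative mapping from Python annotations to JSON Schema type names.
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def sketch_schema(fn) -> dict:
    """Derive a JSON-Schema-like dict from a function's type hints."""
    hints = typing.get_type_hints(fn)
    hints.pop("return", None)  # only parameters belong in the schema
    params = inspect.signature(fn).parameters
    return {
        "type": "object",
        "properties": {
            name: {"type": TYPE_MAP.get(tp, "string")} for name, tp in hints.items()
        },
        "required": [
            name for name, p in params.items()
            if p.default is inspect.Parameter.empty
        ],
    }

def get_weather(city: str, units: str = "celsius") -> str:
    """Get current weather for a city."""
    return f"Sunny in {city} ({units})"

schema = sketch_schema(get_weather)
print(schema["properties"])  # {'city': {'type': 'string'}, 'units': {'type': 'string'}}
print(schema["required"])    # ['city']
```

Parameters with defaults drop out of `required`, which is the same convention MCP clients expect when deciding which arguments they must supply.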

🚀 Quick Start

Installation

```bash
pip install lc2mcp
```

3 Lines to MCP

```python
from langchain_core.tools import tool
from fastmcp import FastMCP
from lc2mcp import register_tools

@tool
def get_weather(city: str) -> str:
    """Get current weather for a city."""
    return f"Sunny, 25°C in {city}"

mcp = FastMCP("weather-server")
register_tools(mcp, [get_weather])  # ← That's it!

if __name__ == "__main__":
    mcp.run()
```

Your tool is now available to Claude, Cursor, and any MCP client.
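For the client side, an MCP client has to be pointed at the running server. For Claude Desktop this is typically an entry in claude_desktop_config.json along these lines (assuming you saved the example above as server.py; the command and path will vary with your setup):

```json
{
  "mcpServers": {
    "weather-server": {
      "command": "python",
      "args": ["/path/to/server.py"]
    }
  }
}
```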


🔌 How It Works

```
┌─────────────────┐      ┌─────────────┐      ┌─────────────────┐
│  LangChain Tool │ ───▶ │   lc2mcp    │ ───▶ │  FastMCP Tool   │
│  (@tool, etc.)  │      │  (adapter)  │      │                 │
└─────────────────┘      └─────────────┘      └────────┬────────┘
                                                       │
                                                       ▼
                                              ┌─────────────────┐
                                              │  FastMCP Server │
                                              └────────┬────────┘
                                                       │
                                                       ▼
                                          ┌───────────────────────┐
                                          │      MCP Clients      │
                                          │ (Claude, Cursor, ...) │
                                          └───────────────────────┘
```
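Conceptually, the adapter step in the middle wraps each tool's run method in a plain callable while carrying over the tool's name and description, which the server then exposes over MCP. The snippet below is an illustration of that pattern, not lc2mcp's actual code (FakeTool and adapt are invented names standing in for BaseTool and the real adapter):

```python
import functools

class FakeTool:
    """Stand-in for a LangChain BaseTool: carries a name, a description, and run()."""
    name = "get_weather"
    description = "Get current weather for a city."

    def run(self, city: str) -> str:
        return f"Sunny, 25°C in {city}"

def adapt(tool):
    """Wrap the tool's run() as a plain function a server could register."""
    @functools.wraps(tool.run)
    def wrapper(**kwargs):
        return tool.run(**kwargs)
    wrapper.__name__ = tool.name        # would become the MCP tool name
    wrapper.__doc__ = tool.description  # would become the MCP tool description
    return wrapper

fn = adapt(FakeTool())
print(fn.__name__)       # get_weather
print(fn(city="Paris"))  # Sunny, 25°C in Paris
```

The real adapter also translates the argument schema and handles async tools, but the name/description/callable triple is the heart of the conversion.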

🔄 lc2mcp vs langchain-mcp-adapters

LangChain and MCP ecosystems can be connected in both directions:

| Direction | Tool | Description |
| --- | --- | --- |
| LangChain → MCP | lc2mcp | Convert LangChain tools to MCP tools (this project) |
| MCP → LangChain | langchain-mcp-adapters | Convert MCP tools to LangChain tools (official) |

When to use lc2mcp:

  • You have existing LangChain tools and want to expose them via MCP
  • You want to build an MCP server using LangChain's rich tool ecosystem
  • You need to serve tools to Claude, Cursor, or other MCP clients

When to use langchain-mcp-adapters:

  • You have MCP servers and want to use them in LangChain agents
  • You want to call MCP tools from LangGraph workflows

Using both together:

```
┌─────────────────────────────────────────────────────────────────┐
│                        Your Application                         │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   LangChain Tools ──── lc2mcp ────▶ MCP Server ──▶ MCP Clients │
│         │                              │          (Claude, etc) │
│         │                              │                        │
│         ▼                              ▼                        │
│   LangChain Agent ◀── langchain-mcp-adapters ─── MCP Tools     │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```

Both libraries are complementary and can be used together to build powerful AI applications that bridge the LangChain and MCP ecosystems.


📚 Examples

Using Community Tools

Instantly expose DuckDuckGo search and Wikipedia to MCP clients:

```bash
pip install lc2mcp langchain-community duckduckgo-search wikipedia
```

```python
from fastmcp import FastMCP
from langchain_community.tools import DuckDuckGoSearchRun, WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper
from lc2mcp import register_tools

mcp = FastMCP("knowledge-server")

register_tools(mcp, [
    DuckDuckGoSearchRun(),
    WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper()),
])

if __name__ == "__main__":
    mcp.run()
```

With Authentication Context

Inject user authentication and app context into your tools:

```python
from dataclasses import dataclass
from fastmcp import Context, FastMCP
from langchain_core.tools import tool
from langgraph.prebuilt import ToolRuntime
from lc2mcp import register_tools

@dataclass(frozen=True)
class UserContext:
    user_id: str
    tenant_id: str

@tool
def whoami(runtime: ToolRuntime[UserContext]) -> str:
    """Return the current user."""
    return f"Hello, user {runtime.context.user_id} from {runtime.context.tenant_id}"

def runtime_adapter(mcp_ctx: Context) -> ToolRuntime[UserContext]:
    return ToolRuntime(
        context=UserContext(
            user_id=mcp_ctx.get_state("user_id") or "anonymous",
            tenant_id=mcp_ctx.get_state("tenant_id") or "default",
        ),
        state={}, config={}, stream_writer=lambda x: None,
        tool_call_id=None, store=None,
    )

mcp = FastMCP("auth-server")
register_tools(mcp, [whoami], runtime_adapter=runtime_adapter)

if __name__ == "__main__":
    mcp.run()
```

With Progress Reporting & Logging

Use MCP context for real-time progress updates and logging:

```python
from fastmcp import Context, FastMCP
from langchain_core.tools import tool
from lc2mcp import register_tools

@tool
async def process_data(data: str, mcp_ctx: Context) -> str:
    """Process data with progress reporting."""
    await mcp_ctx.info(f"Starting: {data}")
    await mcp_ctx.report_progress(0, 100, "Starting")

    # ... processing steps ...
    await mcp_ctx.report_progress(50, 100, "Processing")

    await mcp_ctx.info("Complete!")
    await mcp_ctx.report_progress(100, 100, "Done")
    return f"Processed: {data}"

mcp = FastMCP("processor")
register_tools(mcp, [process_data], inject_mcp_ctx=True)

if __name__ == "__main__":
    mcp.run()
```
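Stripped of the MCP plumbing, the progress calls above follow a plain (current, total, message) contract. The sketch below is a generic illustration of that contract with an invented report callback, not the FastMCP Context API; it shows how a reporter threads through an async tool and what event stream a client would see:

```python
import asyncio

async def process_data(data: str, report) -> str:
    """Process data in chunks, reporting (current, total, message) as we go."""
    await report(0, 100, "Starting")
    chunks = data.split()
    for i, _chunk in enumerate(chunks, start=1):
        # ... real work on _chunk would happen here ...
        await report(round(100 * i / len(chunks)), 100, f"chunk {i}/{len(chunks)}")
    await report(100, 100, "Done")
    return f"Processed: {data}"

events = []

async def collect(current, total, message):
    """A reporter that just records every progress event."""
    events.append((current, total, message))

result = asyncio.run(process_data("alpha beta", collect))
print(result)      # Processed: alpha beta
print(events[0])   # (0, 100, 'Starting')
print(events[-1])  # (100, 100, 'Done')
```

In the real server, FastMCP's Context plays the role of the reporter and forwards each event to the connected MCP client as a progress notification.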

Namespace & Conflict Handling

Organize tools with prefixes and handle name collisions:

```python
from fastmcp import FastMCP
from lc2mcp import register_tools

mcp = FastMCP("multi-domain")

# Prefix all finance tools
register_tools(mcp, finance_tools, name_prefix="finance.")

# Auto-suffix on collision: tool → tool_2 → tool_3
register_tools(mcp, ops_tools, name_prefix="ops.", on_name_conflict="suffix")

if __name__ == "__main__":
    mcp.run()
```
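The "suffix" collision strategy is easy to picture in isolation. This standalone sketch is my own reimplementation of the naming rule described above (resolve_name is an invented helper, not lc2mcp's code); it reproduces the tool → tool_2 → tool_3 behavior:

```python
def resolve_name(name: str, taken: set[str]) -> str:
    """Return name unchanged, or name_2, name_3, ... if it is already taken."""
    if name not in taken:
        return name
    i = 2
    while f"{name}_{i}" in taken:
        i += 1
    return f"{name}_{i}"

registered: set[str] = set()
for requested in ["ops.deploy", "ops.deploy", "ops.deploy"]:
    final = resolve_name(requested, registered)
    registered.add(final)
    print(final)
# ops.deploy
# ops.deploy_2
# ops.deploy_3
```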

📖 API Reference

register_tools()

Convert and register LangChain tools as FastMCP tools on a server.

```python
register_tools(
    mcp: FastMCP,
    tools: list[BaseTool | Callable],
    *,
    name_prefix: str | None = None,           # e.g. "finance." → "finance.get_stock"
    on_name_conflict: str = "error",          # "error" | "overwrite" | "suffix"
    inject_mcp_ctx: bool = False,             # inject mcp_ctx: Context
    runtime_adapter: Callable | None = None,  # Context → ToolRuntime[...]
)
```

to_mcp_tool()

Convert a single LangChain tool to a FastMCP tool for manual registration.

```python
to_mcp_tool(
    tool: BaseTool | Callable,
    *,
    name: str | None = None,
    description: str | None = None,
    args_schema: Type[BaseModel] | None = None,
    inject_mcp_ctx: bool = False,
    runtime_adapter: Callable | None = None,
) -> Callable
```

🔧 Compatibility

| Component | Supported Versions |
| --- | --- |
| Python | 3.10, 3.11, 3.12+ |
| LangChain | >= 1.0.0 |
| FastMCP | >= 2.0.0 |

Tool Support

| Tool Type | Status |
| --- | --- |
| @tool decorated functions | ✅ Full support |
| StructuredTool | ✅ Full support |
| BaseTool subclasses | ✅ Supported (requires args_schema) |

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.


📄 License

MIT License - see LICENSE for details.


Made with ❤️ for the LangChain and MCP communities
