volcano-sdk

The Volcano SDK is a TypeScript framework for building multi-provider AI agents that orchestrate MCP tools, with support for advanced patterns such as parallel execution and multi-agent delegation.

Stars: 383 · Forks: 28 · Updated: Nov 18, 2025 · Validated: Jan 9, 2026


🌋 Volcano SDK

The TypeScript SDK for Multi-Provider AI Agents

Build agents that chain LLM reasoning with MCP tools. Mix OpenAI, Claude, Mistral in one workflow. Parallel execution, branching, loops. Native retries, streaming, and typed errors.

📚 Read the full documentation at volcano.dev →

✨ Features

🤖 Automatic Tool Selection

LLM automatically picks which MCP tools to call based on your prompt. No manual routing needed.

🧩 Multi-Agent Crews

Define specialized agents and let the coordinator autonomously delegate tasks. Like automatic tool selection, but for agents.

💬 Conversational Results

Ask questions about what your agent did. Use .summary() or .ask() instead of parsing JSON.

🔧 100s of Models

OpenAI, Anthropic, Mistral, Bedrock, Vertex, Azure. Switch providers per-step or globally.
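The per-step switch might look like the sketch below. Note the assumptions: an `llmAnthropic` factory mirroring the `llmOpenAI` one shown in Quick Start, and an `llm` override accepted by `.then()` — check volcano.dev for the actual API.

```typescript
import { agent, llmOpenAI, llmAnthropic } from "volcano-sdk";

// Assumption: llmAnthropic exists alongside llmOpenAI with the same shape.
const openai = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });
const claude = llmAnthropic({ apiKey: process.env.ANTHROPIC_API_KEY!, model: "claude-3-5-sonnet-latest" });

const results = await agent({ llm: openai })  // global default provider
  .then({ prompt: "Draft an outline for a launch email." })
  // Assumption: a per-step `llm` field overrides the agent-level default.
  .then({ llm: claude, prompt: "Rewrite the outline in a warmer tone." })
  .run();
```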

🔄 Advanced Patterns

Parallel execution, branching, loops, sub-agent composition. Enterprise-grade workflow control.
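A rough sketch of how parallel fan-out and branching could compose; the `.parallel()` and `.branch()` step names here are illustrative guesses, not confirmed volcano-sdk API — see the docs for the real workflow builders.

```typescript
import { agent, llmOpenAI } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });

// Hypothetical method names: .parallel() and .branch() are illustrative only.
const results = await agent({ llm })
  .parallel([
    { prompt: "Summarize the bug report." },
    { prompt: "List the affected components." },
  ])
  .branch({
    condition: (prev: string) => prev.includes("critical"),  // route on prior output
    then: { prompt: "Draft an incident notice." },
    else: { prompt: "Write a routine ticket description." },
  })
  .run();
```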

📡 Streaming

Stream tokens in real-time as LLMs generate them. Perfect for chat UIs and SSE endpoints.
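For a chat UI, streaming might be wired up as below; the exact hook (an `onToken` callback on the step) is an assumption based on the `.then()` pattern from Quick Start, not confirmed API.

```typescript
import { agent, llmOpenAI } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });

await agent({ llm })
  .then({
    prompt: "Explain MCP in two sentences.",
    // Assumption: a per-token callback; flush each token as it arrives.
    onToken: (token: string) => process.stdout.write(token),
  })
  .run();
```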

🛡️ TypeScript-First

Full type safety with IntelliSense. Catch errors before runtime.

📊 Observability

OpenTelemetry traces and metrics. Export to Jaeger, Prometheus, DataDog, or any OTLP backend.

⚡ Production-Ready

Built-in retries, timeouts, error handling, and connection pooling. Battle-tested at scale.
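Resilience settings would plausibly be passed at agent construction, as in this sketch; the `retries` and `timeoutMs` option names are illustrative assumptions — consult volcano.dev for the actual configuration surface.

```typescript
import { agent, llmOpenAI } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });

// Hypothetical option names: retries / timeoutMs are not confirmed API.
const results = await agent({
  llm,
  retries: 3,        // retry transient provider errors
  timeoutMs: 30_000, // abort a step that hangs
})
  .then({ prompt: "Check that the weather MCP server is reachable." })
  .run();
```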

Explore all features →

Quick Start

Installation

npm install volcano-sdk

That's it! Includes MCP support and all common LLM providers (OpenAI, Anthropic, Mistral, Llama, Vertex).

View installation guide →

Hello World with Automatic Tool Selection

import { agent, llmOpenAI, mcp } from "volcano-sdk";

const llm = llmOpenAI({ 
  apiKey: process.env.OPENAI_API_KEY!, 
  model: "gpt-4o-mini" 
});

const weather = mcp("http://localhost:8001/mcp");
const tasks = mcp("http://localhost:8002/mcp");

// Agent automatically picks the right tools
const results = await agent({ llm })
  .then({ 
    prompt: "What's the weather in Seattle? If it will rain, create a task to bring an umbrella",
    mcps: [weather, tasks]  // LLM chooses which tools to call
  })
  .run();

// Ask questions about what happened
const summary = await results.summary(llm);
console.log(summary);

Multi-Agent Coordinator

import { agent, llmOpenAI } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY! });

// Define specialized agents
const researcher = agent({ llm, name: 'researcher', description: 'Finds facts and data' })
  .then({ prompt: "Research the topic." })
  .then({ prompt: "Summarize the research." });

const writer = agent({ llm, name: 'writer', description: 'Creates content' })
  .then({ prompt: "Write content." });

// Coordinator autonomously delegates to specialists
const results = await agent({ llm })
  .then({
    prompt: "Write a blog post about quantum computing",
    agents: [researcher, writer]  // Coordinator decides when done
  })
  .run();

// Ask what happened
const post = await results.ask(llm, "Show me the final blog post");
console.log(post);

View more examples →

Documentation

📖 Comprehensive Guides

Contributing

We welcome contributions! Please see our Contributing Guide for details.

Questions or Feature Requests?

License

Apache 2.0 - see LICENSE file for details.
