# @ecosyste-ms/mcp

MCP server for querying package ecosystem data from [ecosyste.ms](https://ecosyste.ms).

Queries a local SQLite database of critical packages for fast lookups, with API fallback for packages not in the database.
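The local-first strategy can be sketched roughly as follows. This is a hypothetical illustration: the `db` and `api` objects are stand-ins, not the package's actual internals.

```javascript
// Sketch of the local-first lookup described above. The `db` and `api`
// objects are hypothetical stand-ins, not @ecosyste-ms/mcp internals.
async function lookupPackage(db, api, ecosystem, name) {
  const local = db.get(ecosystem, name); // fast path: local SQLite database
  if (local) return { ...local, source: "local" };
  const remote = await api.fetch(ecosystem, name); // fallback: ecosyste.ms API
  return remote ? { ...remote, source: "api" } : null;
}
```

Packages in the critical set never touch the network; everything else costs one API round trip.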
## Installation

```shell
npm install -g @ecosyste-ms/mcp
```

Or run directly with npx:

```shell
npx @ecosyste-ms/mcp
```

The database is bundled via `@ecosyste-ms/critical`, so no additional setup is required.

To use a custom database, set the `ECOSYSTEMS_DB_PATH` environment variable or place a `critical-packages.db` file in your working directory.
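That resolution order amounts to: explicit environment variable, then the working directory, then the bundled database. A hypothetical sketch of the logic (not the server's actual code):

```javascript
// Hypothetical sketch of the database resolution order described above;
// `exists` is injected so the logic can be shown without touching the filesystem.
function resolveDbPath(env, cwd, exists) {
  if (env.ECOSYSTEMS_DB_PATH) return env.ECOSYSTEMS_DB_PATH; // 1. explicit override
  const local = `${cwd}/critical-packages.db`;               // 2. working directory
  if (exists(local)) return local;
  return null; // 3. null here means "use the bundled @ecosyste-ms/critical database"
}
```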
## Usage with LLM Tools

### Claude Code

Open a terminal and run:

```shell
claude mcp add ecosystems -- npx @ecosyste-ms/mcp
```

With a custom database path:

```shell
claude mcp add ecosystems -- env ECOSYSTEMS_DB_PATH=/path/to/db.sqlite npx @ecosyste-ms/mcp
```

From within Claude Code, run the `/mcp` command to verify the server is running.
### Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):

```json
{
  "mcpServers": {
    "ecosystems": {
      "command": "npx",
      "args": ["@ecosyste-ms/mcp"]
    }
  }
}
```

With a custom database path:

```json
{
  "mcpServers": {
    "ecosystems": {
      "command": "npx",
      "args": ["@ecosyste-ms/mcp"],
      "env": {
        "ECOSYSTEMS_DB_PATH": "/path/to/critical-packages.db"
      }
    }
  }
}
```
### Cursor

Add to `.cursor/mcp.json` in your project, or `~/.cursor/mcp.json` globally:

```json
{
  "mcpServers": {
    "ecosystems": {
      "command": "npx",
      "args": ["@ecosyste-ms/mcp"]
    }
  }
}
```
### VS Code

Open a terminal and run:

```shell
code --add-mcp '{"type":"stdio","name":"ecosystems","command":"npx","args":["@ecosyste-ms/mcp"]}'
```

Or manually add to `.vscode/mcp.json`:

```json
{
  "servers": {
    "ecosystems": {
      "type": "stdio",
      "command": "npx",
      "args": ["@ecosyste-ms/mcp"]
    }
  }
}
```

Then open `.vscode/mcp.json` in VS Code and click "Start server".
### Windsurf

Add to `~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "ecosystems": {
      "command": "npx",
      "args": ["@ecosyste-ms/mcp"]
    }
  }
}
```
### Zed

Add to your Zed settings (`cmd+,`):

```json
{
  "context_servers": {
    "ecosystems": {
      "command": {
        "path": "npx",
        "args": ["@ecosyste-ms/mcp"]
      }
    }
  }
}
```
### ChatGPT

Note: ChatGPT requires remote MCP servers. Run the server behind a tunnel or deploy it.

For local development with a tunnel:

```shell
npx @anthropic-ai/mcp-proxy --port 8080 -- npx @ecosyste-ms/mcp
```

Then in ChatGPT:

- Navigate to Settings > Connectors
- Add a custom connector with your tunnel URL
- The server will be available in Composer > Deep Research
### Codex

Add to `~/.codex/config.toml`:

```toml
[mcp_servers.ecosystems]
command = "npx"
args = ["@ecosyste-ms/mcp"]
```
### Gemini CLI

Add to `~/.gemini/settings.json`:

```json
{
  "mcpServers": {
    "ecosystems": {
      "command": "npx",
      "args": ["@ecosyste-ms/mcp"]
    }
  }
}
```
## Available Tools

### Package Tools

- `get_package` - Get full package data by ecosystem and name
- `lookup_package` - Find a package by PURL, ecosystem + name, or repository URL
- `get_package_versions` - Get all versions with publish dates
- `get_package_advisories` - Get security advisories (CVEs)
- `get_package_repository` - Get repository metadata (stars, forks, language)
- `get_package_dependents` - Get packages that depend on a given package
- `search_packages` - Full-text search (requires the local database)

### Registry Tools

- `list_registries` - List all available package registries
- `get_database_info` - Get local database stats
- `health_check` - Check server health (database connectivity, API availability)
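`lookup_package` accepts a Package URL (PURL) such as `pkg:npm/lodash`. A minimal, hypothetical helper for building one, for illustration only:

```javascript
// Minimal hypothetical PURL builder; see the Package URL spec for the
// full grammar (qualifiers, subpaths, and per-type rules are omitted).
function buildPurl(ecosystem, name, version) {
  const encodedName = name.split("/").map(encodeURIComponent).join("/");
  return `pkg:${ecosystem}/${encodedName}${version ? `@${version}` : ""}`;
}

buildPurl("npm", "lodash");              // "pkg:npm/lodash"
buildPurl("npm", "@babel/core");         // "pkg:npm/%40babel/core"
buildPurl("pypi", "requests", "2.31.0"); // "pkg:pypi/requests@2.31.0"
```

Note how the `@` in a scoped npm name is percent-encoded so it is not confused with the version separator.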
## Examples

Ask your LLM:

- "What license does lodash use?"
- "Show me the CVEs for express"
- "How many stars does the react repository have?"
- "What packages depend on typescript?"
- "Search for packages related to authentication"
- "What's the latest version of axios?"
## Supported Ecosystems

npm, pypi, rubygems, cargo, go, maven, nuget, packagist, hex, pub, hackage, cocoapods, conda, clojars, puppet, homebrew, docker, bower, cpan, cran, julia, swiftpm, elm, deno, alpine, actions, openvsx, spack, adelie, vcpkg, racket, bioconductor, carthage, postmarketos, elpa
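Ecosystem names are passed as lowercase strings. A quick client-side guard over the list above (a hypothetical helper; the server presumably validates names itself):

```javascript
// Hypothetical guard built from the supported-ecosystems list above;
// fails fast on the client before a tool call is made.
const SUPPORTED = new Set([
  "npm", "pypi", "rubygems", "cargo", "go", "maven", "nuget", "packagist",
  "hex", "pub", "hackage", "cocoapods", "conda", "clojars", "puppet",
  "homebrew", "docker", "bower", "cpan", "cran", "julia", "swiftpm", "elm",
  "deno", "alpine", "actions", "openvsx", "spack", "adelie", "vcpkg",
  "racket", "bioconductor", "carthage", "postmarketos", "elpa",
]);

function isSupportedEcosystem(name) {
  return SUPPORTED.has(String(name).toLowerCase());
}
```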
## Development

```shell
git clone https://github.com/ecosyste-ms/mcp
cd mcp
npm install
npm test
```

Run locally:

```shell
node index.js
```
## License

MIT