KFabric is a platform for building traceable and weighted documentary corpora from heterogeneous sources, prioritizing data quality before RAG implementation. It provides tools for document discovery, scoring, and fragment synthesis via an MCP server to prepare high-quality indexable artifacts.
A powerful MCP server for web crawling and search that converts HTML into LLM-optimized Markdown using Mozilla's Readability, featuring SearXNG integration and concurrent processing.
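A minimal sketch of the HTML-to-Markdown idea behind such crawlers, using only the Python standard library rather than Mozilla's Readability (the class and output format here are illustrative, not the server's actual pipeline):

```python
from html.parser import HTMLParser

class MarkdownConverter(HTMLParser):
    """Tiny HTML-to-Markdown converter handling headings, paragraphs, and links."""

    def __init__(self):
        super().__init__()
        self.out = []
        self.href = None  # href of the currently open <a>, if any

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            # Map <h1>-<h3> to the matching number of '#' markers.
            self.out.append("\n" + "#" * int(tag[1]) + " ")
        elif tag == "p":
            self.out.append("\n")
        elif tag == "a":
            self.href = dict(attrs).get("href")
            self.out.append("[")

    def handle_endtag(self, tag):
        if tag == "a" and self.href:
            self.out.append(f"]({self.href})")
            self.href = None

    def handle_data(self, data):
        if data.strip():  # skip whitespace-only text nodes
            self.out.append(data)

    def markdown(self):
        return "".join(self.out).strip()

converter = MarkdownConverter()
converter.feed('<h1>Title</h1><p>See <a href="https://example.com">docs</a>.</p>')
print(converter.markdown())
# → # Title
#   See [docs](https://example.com).
```

A production converter would also handle lists, code blocks, and boilerplate removal, which is where Readability's content extraction comes in.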
MCP server for consuming and managing Groups.io data via the Groups.io REST API
An MCP server that exposes Eduframe resources as tools, allowing users to manage lead records through the Eduframe API. It enables listing, creating, updating, and deleting leads using natural language commands.
Collective intelligence for AI shopping agents — product intel, deals, and more
Enables exploration and search of local filesystems using glob pattern matching to find files and grep to search for text patterns within files.
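The glob-then-grep pattern such a server exposes can be sketched with the standard library alone (the helper name and return shape are illustrative):

```python
import re
from pathlib import Path

def grep_files(root: str, glob_pattern: str, text_pattern: str):
    """Find files under `root` matching a glob, then return
    (path, line_number, line) tuples for lines matching a regex."""
    regex = re.compile(text_pattern)
    hits = []
    for path in sorted(Path(root).rglob(glob_pattern)):
        if not path.is_file():
            continue
        try:
            lines = path.read_text(errors="replace").splitlines()
        except OSError:
            continue  # unreadable file; skip rather than abort the search
        for number, line in enumerate(lines, start=1):
            if regex.search(line):
                hits.append((str(path), number, line))
    return hits

# e.g. grep_files("src", "*.py", r"^import") lists every top-level
# import line in Python files under src/.
```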
AI Agent-Native Data Platform — ingest, validate, transform, query, and search data.
Provides travel planning tools for Cox's Bazar, Bangladesh, including weather forecasts, AI-powered itinerary generation, and pre-configured travel planning prompts.
Compound MCP server — 8 tools for AI agent integration. Hosted by Junct.
Generate expert AI prompts for 140+ platforms with 16-dimension quality scoring.
A serverless implementation of Model Context Protocol (MCP) on Cloudflare Workers that allows AI models to access custom tools without authentication.
Email infrastructure for AI agents. Create inboxes, send/receive email, and search messages.
A Discord-integrated MCP server that provides note storage capabilities, allowing users to create, manage, and summarize notes via URI resources and automated prompts.
Enables web searching through DuckDuckGo and fetching content from webpages. Provides search capabilities with configurable result limits and webpage content extraction for AI assistants.
A secure MCP wrapper for Tennis Warehouse's internal APIs that enables LLMs to search for equipment, check stock, and find deals while protecting sensitive internal data.
Google Docs MCP Pack — read, create, and edit Google Docs via OAuth.
A framework for using AI to easily create an MCP server for any service: drop in the API documentation and ask it to generate the MCP.
Bridges Ghidra's reverse engineering capabilities with AI tools through 179 specialized tools for automated binary analysis and documentation. It supports full read/write access for function decompilation, renaming, and cross-binary documentation transfer in both GUI and headless modes.
A local MCP server that provides tools for working with 7-Zip archives, enabling archive creation, extraction, item management, and file system operations through a standardized interface.
A tool for controlling Spotify playback and managing your music library through natural language, allowing for simple play/pause commands and track management.
Enables AI-powered resume scoring and feedback through secure Google OAuth authentication. Provides FastAPI endpoints for resume evaluation with plans for leaderboard visualization and competitive scoring features.
An MCP (Model Context Protocol) server that bridges to the Korean military service records API, retrieving and verifying service history information with authentication and caching support.
A read-only Model Context Protocol server for Productive.io optimized for LLMs, featuring token-efficient TOON output and comprehensive search across projects, tasks, and documentation.
An event aggregation and management server powered by a Sanity.io backend, allowing users to search, retrieve, and create event data through the Model Context Protocol.
Enables AI assistants to manage API endpoints, environments, and testing workflows through the GASSAPI backend. Provides semantic documentation tools for endpoint cataloging and automated flow creation for backend-to-frontend development workflows.
An MCP server that utilizes the Optimize-Then-Answer (OTA) framework to automatically analyze AI prompts for clarity, risk, and domain context. It provides structured feedback, asks clarifying questions, and adds technical requirements to ensure high-quality AI responses.