🤖 MCP + CrewAI Agentic Integration 🚀
A powerful demonstration of Model Context Protocol (MCP) integrated with CrewAI orchestrations, featuring full observability through AgentOps and high-speed inference via Groq.
🌟 Overview
This project bridges the gap between context-aware tools and autonomous agents. It provides a custom MCP server for real-time external data (Weather, News, Notes) while leveraging CrewAI to orchestrate multi-agent workflows.
🏗️ Architecture
- MCP Layer: A FastMCP server exposing tools for real-time data retrieval.
- Agentic Layer: CrewAI agents specialized in Market Analysis and Research.
- Inference Layer: Ultra-fast LLMs (Llama 3.1) hosted on Groq.
- Observability Layer: AgentOps for tracing, cost management, and debugging.
✨ Key Features
🛠️ Custom MCP Server Tools
- ☀️ Weather Engine: Real-time meteorology data via WeatherAPI.
- 📰 News Intelligence: Global news retrieval via Serper (Google Search API).
- 📝 Contextual Notes: Locally persistent note management for long-term memory.
- 🧠 Auto-Summary: Intelligent summarization of collected context.
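The actual tool implementations live in main.py; as a minimal illustration of how a locally persistent notes tool could work, here is a stdlib-only sketch (the `save_note`/`read_notes` names and the `notes.json` path are assumptions, not the project's actual code):

```python
import json
from pathlib import Path

# Illustrative storage path; the real server may persist notes elsewhere.
NOTES_FILE = Path("notes.json")

def read_notes() -> list[str]:
    """Return all persisted notes (empty list if none saved yet)."""
    if NOTES_FILE.exists():
        return json.loads(NOTES_FILE.read_text())
    return []

def save_note(text: str) -> int:
    """Append a note to the local store and return its index."""
    notes = read_notes()
    notes.append(text)
    NOTES_FILE.write_text(json.dumps(notes))
    return len(notes) - 1
```

In the real server, functions like these would be registered as MCP tools (e.g. via FastMCP's tool decorator) so agents can call them over the protocol.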
👥 Intelligence Crew
- 🔍 Market Researcher: Scours data to identify emerging trends.
- 📈 Data Analyst: Synthesizes research into actionable market insights.
- 🚀 Sequential Workflow: Fully orchestrated execution path for reliable results.
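A sequential workflow simply means each agent's output becomes the next agent's input. Here is a framework-free sketch of that handoff (the function bodies are stand-ins; in the project this is handled by CrewAI's sequential process with LLM-backed agents):

```python
def market_researcher(topic: str) -> str:
    # Stand-in for the LLM-backed researcher agent.
    return f"raw findings on {topic}"

def data_analyst(findings: str) -> str:
    # Stand-in for the analyst agent that synthesizes insights.
    return f"insights derived from: {findings}"

def run_sequential(topic: str) -> str:
    """Run each task in order, feeding the previous output forward."""
    output = topic
    for task in (market_researcher, data_analyst):
        output = task(output)
    return output
```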
🛠️ Tech Stack
- Framework: CrewAI
- Server: FastMCP
- LLM Engine: Groq (Llama 3.1 8B/70B)
- Tracing: AgentOps
- Package Manager: uv
🚀 Getting Started
1. Prerequisites
Ensure you have the following installed:
- Python 3.13+ (with uv recommended as the package manager)
- A valid Groq API Key
- A valid AgentOps API Key
- A Serper API Key (for News)
2. Installation
Clone the repository and sync dependencies:
git clone https://github.com/vad-007/MCP_Integration_crewai.git
cd MCP_Integration_crewai
uv sync
3. Configuration
Create a .env file in the root directory:
AGENTOPS_API_KEY=your_agentops_key
GROQ_API_KEY=your_groq_key
SERPER_API_KEY=your_serper_key
WEATHER_API_KEY=your_weather_key
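These variables are presumably loaded at startup (e.g. via python-dotenv). For illustration, a stdlib-only sketch of the parsing, assuming simple KEY=value lines:

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: put KEY=value lines into os.environ,
    skipping blank lines and # comments; existing vars win."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```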
4. Running the Project
🌐 Start the MCP Server
mcp dev main.py
🚢 Run the CrewAI Integration
python crewai_agentops_integration.py
🔍 Run Diagnostics
python test_agentops.py
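A useful first diagnostic is confirming all required API keys are actually set before a run. A hedged sketch of such a preflight check (the `missing_keys` helper is illustrative, not the contents of test_agentops.py):

```python
import os

REQUIRED_KEYS = (
    "AGENTOPS_API_KEY",
    "GROQ_API_KEY",
    "SERPER_API_KEY",
    "WEATHER_API_KEY",
)

def missing_keys(env=os.environ) -> list[str]:
    """Return the required API keys that are unset or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]
```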
📊 Observability with AgentOps
This project is fully instrumented. Every run generates a unique replay URL, allowing you to:
- Watch Agent Self-Correction: See exactly how agents reason through tasks.
- Trace LLM Calls: Monitor every prompt and completion.
- Analyze Latency: Visualize the execution timeline of your crew.
Check your dashboard at: app.agentops.ai
📂 Project Structure
├── main.py # FastMCP Server implementation
├── crewai_agentops_integration.py # Main CrewAI orchestration
├── test_agentops.py # Connectivity & Diagnostic tool
├── .env # Environment variables (private)
├── pyproject.toml # Project configuration
├── uv.lock # Dependency lockfile
└── docs/ # Troubleshooting & Optimization guides
🤝 Contributing
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (git checkout -b feature/AmazingFeature)
- Commit your Changes (git commit -m 'Add some AmazingFeature')
- Push to the Branch (git push origin feature/AmazingFeature)
- Open a Pull Request
🛡️ License
Distributed under the MIT License. See LICENSE for more information.
Developed with ❤️ for the AI Community.