🤖 MCP Multi-Agent Deep Researcher
A powerful multi-agent research system built on the Model Context Protocol (MCP), using CrewAI for agent orchestration, LinkUp for deep web search, and the phi3 model (via Ollama) for local AI processing. Features both API access and a beautiful web interface for easy research tasks.
🌟 Features
- 🧠 Multi-Agent System: Three specialized AI agents working together
- Web Searcher: Deep web search using LinkUp API
- Research Analyst: Information synthesis and verification
- Technical Writer: Clear, structured content creation
- 🌐 Beautiful Web Interface: Modern, responsive frontend for easy interaction
- 🔌 API Access: RESTful API with FastAPI and automatic documentation
- 🏠 Local AI Processing: Uses Ollama with phi3 model - no external AI API needed
- 📡 MCP Protocol: Full Model Context Protocol compliance for integration
- 🚀 One-Command Launch: Start everything with a single command
🎯 Quick Start
Get started in 3 simple steps:
# 1. Clone the repository
git clone https://github.com/anubhav-77-dev/MCP-Multi-Agent-Deep-Researcher.git
cd MCP-Multi-Agent-Deep-Researcher
# 2. Run the setup script
python3 setup.py
# 3. Launch everything!
python3 launcher.py
That's it! The system will automatically open in your browser at http://localhost:3000/frontend.html
📸 Screenshots
Web Interface
API Documentation
🏗️ Architecture
The system implements a three-agent workflow:
- Web Searcher: Uses LinkUp API to find relevant information from multiple sources
- Research Analyst: Synthesizes and verifies the information, focusing on depth and clarity
- Technical Writer: Produces a clear, comprehensive markdown answer
```mermaid
graph TD
    A[Web Interface] --> B[FastAPI Backend]
    B --> C[MCP Server]
    C --> D[CrewAI Orchestrator]
    D --> E[Web Searcher Agent]
    D --> F[Research Analyst Agent]
    D --> G[Technical Writer Agent]
    E --> H[LinkUp API]
    F --> I[Ollama + phi3]
    G --> I
    H --> J[Web Search Results]
    I --> K[AI Analysis & Writing]
    J --> F
    K --> L[Final Research Output]
```
📋 Prerequisites
- Python 3.10+
- Poetry for dependency management
- Ollama for local AI processing
- LinkUp API Key (get free tier at LinkUp.so)
🚀 Installation & Setup
Option 1: Automated Setup (Recommended)
# Clone the repository
git clone https://github.com/anubhav-77-dev/MCP-Multi-Agent-Deep-Researcher.git
cd MCP-Multi-Agent-Deep-Researcher
# Run automated setup (checks dependencies, installs packages, configures environment)
python3 setup.py
# Launch the application (starts both frontend and backend)
python3 launcher.py
Option 2: Manual Setup
1. Install Dependencies
# Install Poetry (if not already installed)
curl -sSL https://install.python-poetry.org | python3 -
# Install project dependencies
poetry install
2. Install & Configure Ollama
# Install Ollama (visit https://ollama.ai/ for OS-specific instructions)
# On macOS:
brew install ollama
# Start Ollama service
ollama serve
# Pull the phi3 model (in a new terminal)
ollama pull phi3:latest
# Verify installation
ollama list
3. Configure Environment
# Copy environment template
cp .env.example .env
# Edit .env file and add your LinkUp API key
nano .env
Required environment variables:
LINKUP_API_KEY=your_linkup_api_key_here
OLLAMA_BASE_URL=http://localhost:11434
MODEL_NAME=phi3:latest
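As a rough illustration, the `.env` file above is plain `KEY=VALUE` lines; a minimal parser for that format (a hypothetical helper, not part of the project) might look like:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """
# researcher settings
LINKUP_API_KEY=your_linkup_api_key_here
OLLAMA_BASE_URL=http://localhost:11434
MODEL_NAME=phi3:latest
"""
config = parse_env(sample)
```

In practice a library such as python-dotenv handles this, but the sketch shows exactly what the three required variables look like once loaded.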
4. Start the Services
# Option A: Use the launcher (recommended)
python3 launcher.py
# Option B: Start services manually
# Terminal 1 - Backend API
poetry run python Multi-Agent-deep-researcher-mcp-windows-linux/http_server.py
# Terminal 2 - Frontend
python3 -m http.server 3000
# Terminal 3 - MCP Server (optional, for MCP client integration)
poetry run python Multi-Agent-deep-researcher-mcp-windows-linux/server.py
🔑 Get Your LinkUp API Key
- Visit LinkUp.so
- Sign up for a free account
- Get your API key from the dashboard
- Add it to your `.env` file
🎮 Usage
Web Interface (Easiest)
- Launch the application: `python3 launcher.py`
- Open your browser to `http://localhost:3000/frontend.html` (opens automatically)
- Enter your research query or try the example queries
- Choose your mode:
- 🔍 Quick Search: Fast web search with LinkUp API
- 🧠 Full Research: Complete multi-agent analysis workflow
- View results with formatted output, copy/download options
API Access
Quick Search
curl -X POST http://localhost:8080/search \
-H "Content-Type: application/json" \
-d '{"query": "latest AI trends 2024"}'
Full Research
curl -X POST http://localhost:8080/research \
-H "Content-Type: application/json" \
-d '{"query": "comprehensive analysis of quantum computing applications"}'
Health Check
curl http://localhost:8080/health
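The same endpoints can be called from Python. A minimal stdlib-only client sketch, assuming the base URL `http://localhost:8080` and the `{"query": ...}` payload shown in the curl examples above:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"

def build_request(endpoint: str, query: str) -> urllib.request.Request:
    """Build a POST request matching the curl examples above."""
    body = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}{endpoint}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def research(query: str, full: bool = False) -> dict:
    """Run a quick search, or the full multi-agent workflow if full=True."""
    req = build_request("/research" if full else "/search", query)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Requires the backend to be running (python3 launcher.py)
    print(research("latest AI trends 2024"))
```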
MCP Client Integration
For integration with MCP-compatible clients, add this configuration:
{
"mcpServers": {
"crew_research": {
"command": "poetry",
"args": ["run", "python", "Multi-Agent-deep-researcher-mcp-windows-linux/server.py"],
"env": {
"LINKUP_API_KEY": "your_linkup_api_key_here"
}
}
}
}
Available Endpoints
| Endpoint | Method | Description |
|---|---|---|
| `/health` | GET | Health check |
| `/search` | POST | Quick web search |
| `/research` | POST | Full multi-agent research |
| `/docs` | GET | Interactive API documentation |
📁 Project Structure
MCP-Multi-Agent-Deep-Researcher/
├── 🚀 launcher.py # Single-command launcher
├── 🌐 frontend.html # Web interface
├── ⚙️ setup.py # Automated setup script
├── 📋 start.sh # Shell launcher script
├── 📖 QUICKSTART.md # Quick start guide
├── 🔧 Makefile # Development commands
├── 📦 pyproject.toml # Poetry dependencies
├── 🔐 .env.example # Environment template
├── ⚙️ mcp.config.json # MCP client configuration
└── Multi-Agent-deep-researcher-mcp-windows-linux/
├── 🖥️ server.py # MCP protocol server
├── 🌐 http_server.py # FastAPI REST server
├── 🧪 test_research.py # Testing utilities
└── agents/ # Multi-agent system
├── 🤖 research_crew.py # CrewAI orchestration
└── tools/ # Agent tools
├── 🔍 linkup_search.py # Web search integration
└── 🧠 ollama_tool.py # Local AI integration
🎯 Example Queries
Try these sample research queries:
Quick Search Examples
- "What are the latest AI trends in 2024?"
- "Current developments in renewable energy"
- "Recent breakthroughs in quantum computing"
Full Research Examples
- "Comprehensive analysis of the environmental impact of cryptocurrency mining"
- "How does quantum computing work and what are its real-world applications?"
- "The future of autonomous vehicles: technology, challenges, and timeline"
- "Impact of artificial intelligence on healthcare: opportunities and risks"
Agentic Workflow
The system uses CrewAI to orchestrate three specialized agents:
1. Web Searcher Agent
- Role: Web Research Specialist
- Goal: Find comprehensive and relevant information using LinkUp API
- Tools: LinkUp Search Tool
- Output: Detailed summary of web search results with sources
2. Research Analyst Agent
- Role: Research Analyst
- Goal: Analyze and synthesize information to provide comprehensive insights
- Input: Web search results
- Output: Structured analysis with key insights and verified information
3. Technical Writer Agent
- Role: Technical Writer
- Goal: Create clear, comprehensive, and well-structured written content
- Input: Research analysis
- Output: Comprehensive, well-formatted markdown document
Configuration
Environment Variables
- `LINKUP_API_KEY`: Your LinkUp API key for web search functionality
- `OLLAMA_BASE_URL`: Base URL for the Ollama API (default: `http://localhost:11434`)
- `MODEL_NAME`: Ollama model to use (default: `phi3:latest`)
Customizing Agents
You can customize the agents by modifying agents/research_crew.py:
- Adjust agent roles, goals, and backstories
- Modify task descriptions and expected outputs
- Add or remove tools for specific agents
- Change the process flow (sequential, hierarchical, etc.)
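For orientation, the sequential hand-off that CrewAI manages can be sketched without the framework: each agent consumes the previous agent's output. The function names below are illustrative stubs, not the project's actual code:

```python
from typing import Callable, List

def run_sequential(steps: List[Callable[[str], str]], query: str) -> str:
    """Pipe the query through each agent step in order (like a sequential process)."""
    result = query
    for step in steps:
        result = step(result)
    return result

# Stubs standing in for the three agents
def web_searcher(q: str) -> str:
    return f"search results for: {q}"

def research_analyst(results: str) -> str:
    return f"analysis of [{results}]"

def technical_writer(analysis: str) -> str:
    return f"# Report\n\n{analysis}"

report = run_sequential(
    [web_searcher, research_analyst, technical_writer],
    "quantum computing",
)
```

Swapping the order of steps, or inserting a new one, corresponds to editing the task list passed to the crew in `agents/research_crew.py`.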
🐛 Troubleshooting
Common Issues & Solutions
🚫 "Address already in use" error
# Kill existing processes
pkill -f "python.*http_server"
pkill -f "http.server"
# Or restart with different ports
python3 launcher.py
🔗 Ollama connection failed
# Check if Ollama is running
ollama serve
# Verify model is available
ollama list
# Pull model if missing
ollama pull phi3:latest
# Check Ollama is accessible
curl http://localhost:11434/api/tags
🔑 LinkUp API errors
- Verify the API key in your `.env` file
- Check the LinkUp dashboard for usage limits
- Test the API key:
curl -H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"q": "test"}' \
https://api.linkup.so/v1/search
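To script the same check, the request can be assembled in Python. The endpoint and the `q` field are taken from the curl example above; treat the exact payload shape as illustrative:

```python
import json
import urllib.request

def linkup_request(api_key: str, query: str) -> urllib.request.Request:
    """Mirror the curl test: bearer auth plus a JSON body with the query."""
    return urllib.request.Request(
        "https://api.linkup.so/v1/search",
        data=json.dumps({"q": query}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = linkup_request("YOUR_API_KEY", "test")
# urllib.request.urlopen(req) would perform the actual call
```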
🧩 Dependency issues
# Reinstall dependencies
poetry install --no-cache
# Or use pip fallback
pip install -r requirements.txt
# Check Python version
python3 --version # Should be 3.10+
🌐 CORS/Frontend issues
- Make sure both servers are running
- Check browser console for errors
- Try accessing the backend directly: `http://localhost:8080/health`
- Clear the browser cache and reload
Getting Help
- Check logs: The launcher shows detailed logs for both servers
- Run diagnostics: `python3 setup.py` to verify the setup
- Test components: `python3 simple_test.py` for individual tests
- Enable debug mode: set `logging.basicConfig(level=logging.DEBUG)` in the server files
🛠️ Development
Quick Commands
# Start everything
make start # or make launch, make demo
# Development setup
make dev-setup # Install deps + setup + verify
# Run tests
make test # Basic functionality test
make quick-test # Quick search test
# Maintenance
make clean # Clean cache files
make verify # Verify installation
Adding Custom Agents
- Create a new agent in `agents/research_crew.py`:
custom_agent = Agent(
    role='Custom Specialist',
    goal='Your specific goal',
    backstory='Agent background',
    tools=[your_tools]
)
- Add it to the crew workflow:
custom_task = Task(
    description="Task description",
    agent=custom_agent,
    expected_output="Expected result format"
)
Adding New Tools
- Create a tool file in `agents/tools/`:
class CustomTool(BaseTool):
    name = "Custom Tool"
    description = "Tool description"

    def _run(self, query: str) -> str:
        # Tool implementation
        return result
- Register the tool with agents in `research_crew.py`
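In plain Python terms, the tool contract above boils down to a named object with a description and a `_run()` method. A framework-free stand-in (illustrative only, without the real `BaseTool` base class) shows the wiring:

```python
class EchoTool:
    """Minimal stand-in for the tool contract: a name, a description, and _run()."""
    name = "Echo Tool"
    description = "Returns the query wrapped in a marker, for testing the wiring."

    def _run(self, query: str) -> str:
        # A real tool would call an external service here (e.g. a search API)
        return f"[echo] {query}"

tool = EchoTool()
output = tool._run("hello")
```

The agent framework calls `_run()` with the query string and feeds the returned text back into the agent's reasoning loop.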
Environment Configuration
| Variable | Description | Default |
|---|---|---|
| `LINKUP_API_KEY` | LinkUp search API key | Required |
| `OLLAMA_BASE_URL` | Ollama server URL | `http://localhost:11434` |
| `MODEL_NAME` | Ollama model name | `phi3:latest` |
| `OPENAI_API_KEY` | Set to `ollama` for local use | `ollama` |
| `OPENAI_API_BASE` | Ollama OpenAI-compatible endpoint | `http://localhost:11434/v1` |
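The last two variables exist because the agents talk to Ollama through its OpenAI-compatible `/v1` endpoint. A sketch of the request shape, with the payload fields assumed from the OpenAI chat-completions convention:

```python
import json
import os

def chat_payload(prompt: str) -> dict:
    """Build an OpenAI-style chat request aimed at Ollama's /v1 endpoint."""
    return {
        "model": os.environ.get("MODEL_NAME", "phi3:latest"),
        "messages": [{"role": "user", "content": prompt}],
    }

base = os.environ.get("OPENAI_API_BASE", "http://localhost:11434/v1")
url = f"{base}/chat/completions"
body = json.dumps(chat_payload("Summarize the findings."))
```

Setting `OPENAI_API_KEY=ollama` satisfies clients that require a key to be present, even though the local server does not check it.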
🤝 Contributing
We welcome contributions! Here's how to get started:
Development Setup
# Fork and clone the repo
git clone https://github.com/anubhav-77-dev/MCP-Multi-Agent-Deep-Researcher.git
cd MCP-Multi-Agent-Deep-Researcher
# Install development dependencies
poetry install --with dev
# Run pre-commit setup
pre-commit install
Contribution Guidelines
- 🍴 Fork the repository
- 🌿 Create a feature branch: `git checkout -b feature/amazing-feature`
- ✨ Make your changes with clear, commented code
- 🧪 Add tests for new functionality
- ✅ Run tests: `make test`
- 📝 Update documentation as needed
- 🚀 Submit a pull request
Areas for Contribution
- 🔧 New agent tools and integrations
- 🎨 Frontend UI/UX improvements
- 📚 Documentation and examples
- 🧪 Test coverage expansion
- 🐛 Bug fixes and performance improvements
- 🌍 Internationalization
📊 Performance & Scaling
- Quick Search: ~2-5 seconds (LinkUp API dependent)
- Full Research: ~30-60 seconds (depends on query complexity)
- Concurrent Users: Supports multiple simultaneous requests
- Memory Usage: ~500MB-1GB (Ollama model dependent)
- Disk Space: ~3GB (including phi3 model)
🔒 Security & Privacy
- ✅ Local AI Processing: No data sent to external AI services
- ✅ API Key Security: LinkUp API key stored locally only
- ✅ No Data Persistence: Research queries not stored by default
- ✅ CORS Protection: Configurable origin restrictions
- ⚠️ Web Search: Queries sent to LinkUp API (see their privacy policy)
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
Built with amazing open-source technologies:
- 🤖 CrewAI - Multi-agent orchestration framework
- 🔍 LinkUp - Deep web search API
- 🧠 Ollama - Local LLM serving platform
- ⚡ FastAPI - Modern Python web framework
- 🎭 Model Context Protocol - AI integration standard
- 📦 Poetry - Python dependency management
Made with ❤️ for the AI research community