
🤖 MCP Enterprise AI Assistant System

A comprehensive Model Context Protocol (MCP) based AI application featuring intelligent server selection, specialized AI tools, and a beautiful web interface for enterprise-grade text analysis, code review, sentiment analysis, and knowledge management.

🌟 Workshop-Ready Features

🏗️ True MCP (Model Context Protocol) Implementation

  • 📋 JSON-RPC 2.0 Protocol: Full compliance with JSON-RPC 2.0 specification
  • 🤝 MCP Initialize Handshake: Proper server capability negotiation
  • 🔄 Standard MCP Methods: initialize, tools/list, tools/call, resources/list
  • ⚡ WebSocket Transport: Persistent connections as per MCP specification
  • 🎯 Tool Schema Compliance: Proper inputSchema format for tool definitions

🚀 AI-Powered Enterprise Features

  • 🧠 Intelligent Server Selection: AI routing to appropriate servers based on context
  • 📝 Text Analysis: AI summarization, entity extraction, and classification
  • 🔍 Code Review: Automated quality analysis, bug detection, improvements
  • 😊 Sentiment Analysis: Advanced emotion detection and sentiment scoring
  • 📚 Knowledge Management: Document Q&A and information retrieval
  • 🎨 Beautiful Web UI: Professional interface with structured result display
  • 🔧 Configurable AI Models: Support for Ollama (local) and Azure OpenAI
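
The intelligent routing above can be illustrated with a toy keyword matcher. This is a hypothetical sketch only — the real host delegates the decision to the configured AI model, and `SERVER_KEYWORDS` / `select_server` are invented names, not part of the project:

```python
# Simplified illustration of context-based server routing.
# The actual host uses an AI model for this; the keyword table
# below is an invented stand-in for demonstration.

SERVER_KEYWORDS = {
    "text_analysis": ["summarize", "extract", "classify"],
    "code_review": ["review", "bug", "code quality"],
    "sentiment_analysis": ["sentiment", "emotion", "feeling"],
    "knowledge": ["search", "find", "know about"],
}

def select_server(query: str) -> str:
    """Pick the MCP server whose keywords best match the query."""
    q = query.lower()
    scores = {
        server: sum(kw in q for kw in keywords)
        for server, keywords in SERVER_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    # Fall back to text analysis when nothing matches.
    return best if scores[best] > 0 else "text_analysis"
```

A query like "Analyze sentiment: I love this product!" would score highest for the sentiment server, while anything unmatched falls through to text analysis.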

🏗️ System Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   Web Frontend  │────│   Flask App     │────│   MCP Host      │
│   (HTML/JS/CSS) │    │   (Web Server)  │    │ (AI Coordinator)│
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                                        │
                                               ┌─────────────────┐
                                               │   MCP Client    │
                                               │ (Communication) │
                                               └─────────────────┘
                                                        │
        ┌───────────────────┬───────────────────┬───────────────────┬───────────────────┐
        │                   │                   │                   │                   │
┌───────▼────────┐ ┌────────▼────────┐ ┌───────▼────────┐ ┌────────▼────────┐
│ Text Analysis  │ │  Code Review    │ │ Sentiment      │ │   Knowledge     │
│    Server      │ │    Server       │ │   Analysis     │ │    Server       │
│   Port: 8001   │ │   Port: 8002    │ │   Port: 8003   │ │   Port: 8004    │
└────────────────┘ └─────────────────┘ └────────────────┘ └─────────────────┘

📋 Prerequisites

1. Python Installation

2. Ollama Installation & Setup

Install Ollama

  1. Download Ollama: Visit https://ollama.ai/download
  2. Install for your OS:
    • Windows: Download and run the installer
    • macOS: brew install ollama
    • Linux: curl -fsSL https://ollama.ai/install.sh | sh

Download Required AI Model

# Download the Llama 3.2 3B model (recommended for this project)
ollama pull llama3.2:3b

# Verify the model is installed
ollama list

Start Ollama Service

# Start Ollama service (keep this running)
ollama serve

3. Git Installation (for cloning the repository)

🚀 Installation & Setup

Step 1: Clone the Repository

git clone https://github.com/harunraseed07/ADC_MCP_Project.git
cd ADC_MCP_Project

Step 2: Install Python Dependencies

# Install required packages
pip install -r requirements.txt

Step 3: Verify Ollama Model

# Make sure llama3.2:3b is available
ollama list

# If not installed, download it
ollama pull llama3.2:3b

Step 4: Configure the System

The system is pre-configured to use:

  • AI Provider: Ollama (local)
  • Model: llama3.2:3b
  • Ports: 8001-8004 for MCP servers, 5000 for web interface

Configuration files are in the config/ directory.

🎯 Quick Start

Method 1: Automated Start (Recommended)

# Start all servers and web application (Windows)
start_demo_system.bat

# For PowerShell
./start_demo_system.bat

Method 2: Manual Start

# Terminal 1: Start Text Analysis Server
python -m mcp_servers.text_analysis_server

# Terminal 2: Start Code Review Server  
python -m mcp_servers.code_review_server

# Terminal 3: Start Sentiment Analysis Server
python -m mcp_servers.sentiment_analysis_server

# Terminal 4: Start Knowledge Server
python -m mcp_servers.knowledge_server

# Terminal 5: Start Web Application
python web_app/app.py

Access the Application

  1. Open your browser
  2. Navigate to: http://localhost:5000
  3. Start interacting with the AI assistant!

💡 Usage Examples

Text Analysis

"Summarize this text: [your text here]"
"Extract entities from: [your text]"
"Classify this content: [your content]"

Code Review

"Review this Python code: def function_name():"
"Check this JavaScript for bugs: [your code]"
"Analyze code quality: [your code]"

Sentiment Analysis

"Analyze sentiment: I love this product!"
"What's the emotion in: [your text]"
"Sentiment of customer feedback: [feedback]"

Knowledge Management

"Search for information about: [topic]"
"What do you know about: [subject]"
"Find documents related to: [query]"

🔌 MCP Protocol Implementation

JSON-RPC 2.0 Compliance

All communication follows the JSON-RPC 2.0 specification:

{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "summarize_text",
        "arguments": {"text": "Your text here"}
    }
}

MCP Standard Methods

  • initialize: Server capability handshake
  • tools/list: Get available tools with schemas
  • tools/call: Execute tool with arguments
  • resources/list: List available resources
  • resources/read: Read resource content
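
The request shapes used by these methods can be sketched with nothing beyond the standard library. The helper below is a minimal illustration (transport over WebSocket is omitted, and `make_request` is an invented name):

```python
import json
from itertools import count

# Minimal helper for building MCP-style JSON-RPC 2.0 requests.
# Each request gets a fresh integer id, as the spec requires.
_ids = count(1)

def make_request(method: str, params: dict = None) -> str:
    """Serialize a JSON-RPC 2.0 request for an MCP method."""
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Typical call sequence against one of the servers:
init = make_request("initialize", {"clientInfo": {"name": "demo-client"}})
tools = make_request("tools/list")
call = make_request("tools/call", {
    "name": "summarize_text",
    "arguments": {"text": "Your text here"},
})
```

Each serialized string would then be sent over the server's WebSocket connection, and the matching response correlated by its `id`.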

Tool Schema Format

{
    "name": "summarize_text",
    "description": "AI-powered text summarization",
    "inputSchema": {
        "type": "object",
        "properties": {
            "text": {
                "type": "string",
                "description": "Text to summarize"
            }
        },
        "required": ["text"]
    }
}
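
A server receiving a tools/call request should check the arguments against this inputSchema. A production server would use a full JSON Schema validator (e.g. the jsonschema package); the sketch below only checks required fields and basic types, and `validate_arguments` is an invented helper, not the project's code:

```python
# Minimal validation of tool arguments against an MCP inputSchema.
# Only required-field presence and basic type checks are covered.

_TYPES = {"string": str, "number": (int, float), "boolean": bool,
          "object": dict, "array": list}

def validate_arguments(schema: dict, arguments: dict) -> list:
    """Return a list of validation errors (empty list = valid)."""
    errors = []
    for field in schema.get("required", []):
        if field not in arguments:
            errors.append(f"missing required field: {field}")
    for field, spec in schema.get("properties", {}).items():
        expected = _TYPES.get(spec.get("type"))
        if field in arguments and expected and not isinstance(arguments[field], expected):
            errors.append(f"{field}: expected {spec['type']}")
    return errors
```

For the summarize_text schema above, `{"text": "hello"}` validates cleanly, while an empty argument object reports the missing required field.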

🔧 Configuration

AI Model Configuration

Edit config/mcp_config.json:

{
    "ai_provider": "ollama",
    "ai_model": "llama3.2:3b",
    "ai_base_url": "http://localhost:11434"
}
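
A loader for this file might look like the sketch below. The function name and fallback behavior are assumptions for illustration; only the defaults and the path come from the configuration shown above:

```python
import json
from pathlib import Path

def load_config(path: str = "config/mcp_config.json") -> dict:
    """Load AI provider settings, falling back to the Ollama defaults."""
    defaults = {
        "ai_provider": "ollama",
        "ai_model": "llama3.2:3b",
        "ai_base_url": "http://localhost:11434",
    }
    cfg_path = Path(path)
    if cfg_path.exists():
        # File values override the built-in defaults.
        defaults.update(json.loads(cfg_path.read_text()))
    return defaults
```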

Server Ports

  • Text Analysis: 8001
  • Code Review: 8002
  • Sentiment Analysis: 8003
  • Knowledge: 8004
  • Web Interface: 5000

🛠️ Troubleshooting

Common Issues

  1. "Ollama not found" Error

    # Make sure Ollama is running
    ollama serve
    
  2. "Model not found" Error

    # Download the required model
    ollama pull llama3.2:3b
    
  3. Port Already in Use

    # Check what's using the ports
    netstat -ano | findstr :8001
    # Kill processes if needed
    kill_all_servers.bat
    
  4. Python Module Not Found

    # Reinstall dependencies
    pip install -r requirements.txt

  5. Web UI Not Loading

    • Check that Flask is running on the correct port (default: 5000)
    • Verify that the template files exist in web_app/templates/
    • Check the browser console for errors

Utility Scripts

  • check_ports.bat - Check which ports are in use
  • kill_all_servers.bat - Stop all running servers
  • start_demo_system.bat - Start entire system

🎓 Workshop Activities

Activity 1: Basic Setup (15 minutes)

  1. Install prerequisites (Python, Ollama)
  2. Download the llama3.2:3b model
  3. Clone and setup the project
  4. Start the system and verify it's working

Activity 2: Understanding MCP (20 minutes)

  1. Explore the system architecture
  2. Examine how AI routing works
  3. Test different types of queries
  4. Observe server selection logic

Activity 3: Customization (25 minutes)

  1. Modify server responses
  2. Add new AI tools
  3. Customize the web interface
  4. Experiment with different AI models

Activity 4: Advanced Features (20 minutes)

  1. Implement custom server logic
  2. Add new MCP servers
  3. Integrate external APIs
  4. Deploy to production environment

📁 Project Structure

ADC_MCP_Project/
├── mcp_servers/           # MCP server implementations
│   ├── text_analysis_server.py
│   ├── code_review_server.py
│   ├── sentiment_analysis_server.py
│   └── knowledge_server.py
├── mcp_client/            # MCP client for communication
│   └── client.py
├── mcp_host/              # AI-powered MCP host
│   ├── host.py
│   └── ai_models.py
├── web_app/               # Flask web application
│   ├── app.py
│   ├── templates/
│   └── static/
├── config/                # Configuration files
│   └── mcp_config.json
├── scripts/               # Utility scripts
└── requirements.txt       # Python dependencies

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature-name
  3. Make your changes
  4. Test thoroughly
  5. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

  • Issues: Report bugs and request features via GitHub Issues
  • Documentation: Check the project files for detailed guides
  • Community: Join discussions for help and collaboration

🎉 Acknowledgments

  • Built with the Model Context Protocol (MCP) framework
  • Powered by Ollama and Llama 3.2 AI models
  • Inspired by enterprise AI automation needs


Logging

Enable debug logging by setting:

FLASK_DEBUG=True
LOG_LEVEL=DEBUG

View logs in the terminal where you started the application.
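
One way the LOG_LEVEL variable might be consumed on the Python side is sketched below; the exact wiring inside web_app/app.py is an assumption (Flask reads FLASK_DEBUG itself):

```python
import logging
import os

def configure_logging() -> int:
    """Configure root logging from LOG_LEVEL and return the level used."""
    level_name = os.environ.get("LOG_LEVEL", "INFO").upper()
    # Unknown level names fall back to INFO rather than crashing.
    level = getattr(logging, level_name, logging.INFO)
    logging.basicConfig(
        level=level,
        format="%(asctime)s %(name)s %(levelname)s: %(message)s",
    )
    return level
```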

📊 Performance

Resource Usage

  • Memory: ~200-500MB (depending on AI model)
  • CPU: Low (spikes during AI inference)
  • Network: Minimal (local WebSocket communication)

Scalability

  • Each server can handle multiple concurrent connections
  • AI model responses are cached for common queries
  • WebSocket connections are persistent and efficient

🔒 Security

Current Implementation

  • Local communication only (localhost)
  • No authentication required
  • Mock data for demonstration

Production Considerations

  • Add authentication and authorization
  • Use HTTPS/WSS for encrypted communication
  • Implement rate limiting
  • Add input validation and sanitization
  • Use real databases with proper security


Code Style

  • Follow PEP 8 for Python code
  • Use meaningful variable and function names
  • Add docstrings for classes and functions
  • Comment complex logic



🎯 Future Enhancements

  • Add more server types (weather, news, etc.)
  • Implement user authentication
  • Add persistent data storage
  • Support for multiple AI models simultaneously
  • Real-time notifications
  • Mobile-responsive improvements
  • API documentation with Swagger
  • Docker containerization
  • Kubernetes deployment support
  • Monitoring and metrics dashboard

Built with ❤️ using the Model Context Protocol (MCP)
