MCP Knowledge Base Server
A Model Context Protocol (MCP) server that enables semantic search and document management using your own Supabase and OpenAI accounts.
Live Demo: https://mcp-supabase.maxsmosarski.me/mcp
Features
- 📄 Upload and process documents (text, PDF, images)
- 🔍 Semantic search across your knowledge base
- 🖼️ AI-powered image description and search
- 🔐 Use your own API keys - no shared credentials
- ☁️ Deploy to Cloudflare Workers or run locally
- 🌍 Edge computing with Durable Objects
- 👥 Multi-tenant support via request headers
Example Applications
This repository includes two complete example applications that demonstrate how to build on top of the MCP server:
1. Middle Layer (applications/middle-layer/)
A Python FastAPI server that bridges the MCP server with OpenAI's Agents SDK:
- Provides a conversational AI interface with memory
- Manages conversation sessions and history
- Integrates with OpenAI Agents for advanced reasoning
- Handles file uploads and document processing
- See applications/middle-layer/README.md for setup
2. Web Application (applications/web-app/)
A modern React frontend for the knowledge base system:
- Clean, responsive chat interface
- Document management sidebar
- File upload with drag-and-drop support
- Real-time conversation streaming
- Image preview and management
- See applications/web-app/README.md for setup
Quick Start with Example Apps
# 1. Set up the MCP server (see Quick Start below)
# 2. Set up the middle layer
cd applications/middle-layer
cp .env.example .env # Edit with your API keys
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -r requirements.txt
python server.py
# 3. Set up the web app
cd ../web-app
npm install
npm run dev # Opens at http://localhost:5173
For detailed setup instructions, see the README files in each application folder.
Quick Start
# Clone the repository
git clone https://github.com/maxsmosarski/mcp-knowledge-base.git
cd mcp-knowledge-base
# Install dependencies
npm install
# Option 1: Run locally
npm start
# Option 2: Deploy to Cloudflare Workers
wrangler login
wrangler deploy
Three Implementations
This repository contains three implementations of the MCP server:
1. HTTP Server (src/mcp-server.js)
- Uses @modelcontextprotocol/sdk with StreamableHTTPServerTransport
- Runs as an HTTP server on port 3000 (configurable)
- Perfect for API integrations and web clients
- Can be deployed to any Node.js hosting environment
2. STDIO Server (src/stdio-server.js)
- Uses @modelcontextprotocol/sdk with StdioServerTransport
- Communicates via standard input/output
- Designed for Claude Desktop and CLI integrations
- Ideal for local tool usage
3. Cloudflare Workers (src/mcp-agent.js)
- Uses Cloudflare's agents SDK (v0.0.109) with native Worker support
- Implements McpAgent with Durable Objects for stateful sessions
- Credentials passed via request headers for multi-tenant support
- Provides both SSE (/sse) and streamable HTTP (/mcp) endpoints
- Live deployment: https://mcp-supabase.max-smosarski.workers.dev
Prerequisites
- Supabase account with a configured database
- OpenAI API key
- Node.js 18+ (for local development)
- Cloudflare account (free tier works) for Workers deployment
- Wrangler CLI (npm install -g wrangler) for deployment
Supabase Setup
Create a new Supabase project and run these SQL commands in the SQL editor:
-- Enable pgvector extension
CREATE EXTENSION IF NOT EXISTS vector;
-- Create documents table
CREATE TABLE documents (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
filename TEXT NOT NULL,
content TEXT,
content_type TEXT NOT NULL DEFAULT 'text',
file_url TEXT,
metadata JSONB DEFAULT '{}',
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- Create document chunks table for semantic search
CREATE TABLE document_chunks (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
document_id UUID REFERENCES documents(id) ON DELETE CASCADE,
content TEXT NOT NULL,
embedding vector(1536),
chunk_index INTEGER,
metadata JSONB DEFAULT '{}',
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- Create search function
CREATE OR REPLACE FUNCTION search_chunks(
query_embedding vector(1536),
match_count INT DEFAULT 5,
similarity_threshold FLOAT DEFAULT 0.3
)
RETURNS TABLE (
id UUID,
document_id UUID,
content TEXT,
filename TEXT,
similarity FLOAT
)
LANGUAGE plpgsql
AS $$
BEGIN
RETURN QUERY
SELECT
dc.id,
dc.document_id,
dc.content,
d.filename,
1 - (dc.embedding <=> query_embedding) AS similarity
FROM document_chunks dc
JOIN documents d ON dc.document_id = d.id
WHERE 1 - (dc.embedding <=> query_embedding) > similarity_threshold
ORDER BY dc.embedding <=> query_embedding
LIMIT match_count;
END;
$$;
-- Create storage bucket for images (in Supabase Dashboard > Storage)
-- Create a bucket named 'images' with public access
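The similarity that search_chunks returns is one minus pgvector's cosine distance operator (<=>). As a sanity check, the same computation can be reproduced client-side; this is an illustrative sketch (the helper name is not part of this repo):

```javascript
// Cosine similarity, matching the SQL above: similarity = 1 - (a <=> b),
// where pgvector's <=> operator returns cosine distance.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical vectors score 1.0; orthogonal vectors score 0 and would be
// filtered out by the default similarity_threshold of 0.3.
```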
Installation
# Clone the repository
git clone <your-repo-url>
cd mcp-server
# Install dependencies
npm install
Running Locally
Option 1: HTTP Server (for API access)
# Set environment variables (optional defaults)
export SUPABASE_URL="your-supabase-url"
export SUPABASE_SERVICE_KEY="your-supabase-key"
export OPENAI_API_KEY="your-openai-key"
# Start the HTTP server
npm start
# Server runs on http://localhost:3000
# Development mode with auto-reload
npm run dev
Option 2: STDIO Server (for Claude Desktop)
Run directly:
# Set environment variables
export SUPABASE_URL="your-supabase-url"
export SUPABASE_SERVICE_KEY="your-supabase-key"
export OPENAI_API_KEY="your-openai-key"
# Start STDIO server
npm run start:stdio
Or add to your Claude Desktop configuration:
{
"mcpServers": {
"knowledge-base": {
"command": "node",
"args": ["/path/to/mcp-server/start-stdio.js"],
"env": {
"SUPABASE_URL": "your-supabase-url",
"SUPABASE_SERVICE_KEY": "your-supabase-key",
"OPENAI_API_KEY": "your-openai-key"
}
}
}
}
Deploying to Cloudflare Workers
Using the Agents SDK Implementation
- Login to Cloudflare:
wrangler login
- Deploy the Worker:
# Deploy to production
npm run deploy
# Or use wrangler directly
wrangler deploy
# Development server (local testing)
npm run deploy:dev
- Important Notes:
- Uses Durable Objects for stateful MCP sessions
- Free tier requires new_sqlite_classes in migrations
- Credentials are passed via headers, not environment variables
- Each request must include credential headers
The deployed worker will be available at:
- Health check: https://your-worker.workers.dev/
- SSE endpoint: https://your-worker.workers.dev/sse
- MCP endpoint: https://your-worker.workers.dev/mcp
Cloudflare Configuration
Note: The Cloudflare Workers implementation uses request headers for credentials, not environment variables. This allows multi-tenant usage where each user provides their own API keys.
Required Headers for Each Request:
- x-supabase-url: Your Supabase project URL
- x-supabase-key: Your Supabase service key
- x-openai-key: Your OpenAI API key
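Since every request must carry these headers, they can be assembled once and reused; a minimal sketch (the function name is illustrative, not part of this repo):

```javascript
// Build the per-request credential headers the Worker expects.
// The Worker stores no credentials, so every request must include them.
function credentialHeaders({ supabaseUrl, supabaseKey, openaiKey }) {
  return {
    'x-supabase-url': supabaseUrl,
    'x-supabase-key': supabaseKey,
    'x-openai-key': openaiKey,
  };
}
```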
Durable Objects Configuration (in wrangler.toml):
[[durable_objects.bindings]]
name = "MCP_OBJECT"
class_name = "KnowledgeBaseMCP"
[[migrations]]
tag = "v1"
new_sqlite_classes = ["KnowledgeBaseMCP"] # Required for free tier
Usage
API Examples
1. Initialize MCP Session (Required First)
const response = await fetch('https://mcp-supabase.max-smosarski.workers.dev/mcp', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Accept': 'application/json, text/event-stream',
'x-supabase-url': 'https://your-project.supabase.co',
'x-supabase-key': 'your-service-key',
'x-openai-key': 'sk-...'
},
body: JSON.stringify({
jsonrpc: '2.0',
method: 'initialize',
params: { protocolVersion: '2025-06-18' },
id: 1
})
});
// Save the session ID from response headers
const sessionId = response.headers.get('Mcp-Session-Id');
2. Search Documents
fetch('https://mcp-supabase.max-smosarski.workers.dev/mcp', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Accept': 'application/json, text/event-stream',
'Mcp-Session-Id': sessionId,
'x-supabase-url': 'https://your-project.supabase.co',
'x-supabase-key': 'your-service-key',
'x-openai-key': 'sk-...'
},
body: JSON.stringify({
jsonrpc: '2.0',
method: 'tools/call',
params: {
name: 'search_chunks',
arguments: { query: 'your search query', match_count: 5 }
},
id: 2
})
});
3. List All Files
fetch('https://mcp-supabase.max-smosarski.workers.dev/mcp', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Accept': 'application/json, text/event-stream',
'Mcp-Session-Id': sessionId,
'x-supabase-url': 'https://your-project.supabase.co',
'x-supabase-key': 'your-service-key',
'x-openai-key': 'sk-...'
},
body: JSON.stringify({
jsonrpc: '2.0',
method: 'tools/call',
params: {
name: 'get_files',
arguments: {}
},
id: 3
})
});
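The examples above repeat the same JSON-RPC envelope; they can be wrapped in a small helper. This is a sketch, not code from this repo (buildToolCall and callTool are illustrative names):

```javascript
// Build a JSON-RPC 2.0 tools/call body, as used in the examples above.
function buildToolCall(name, args, id) {
  return {
    jsonrpc: '2.0',
    method: 'tools/call',
    params: { name, arguments: args },
    id,
  };
}

// Illustrative wrapper: POST the body with the session and credential headers.
// Requires Node 18+ (global fetch) or a browser environment.
async function callTool(endpoint, sessionId, credHeaders, name, args, id) {
  return fetch(endpoint, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Accept': 'application/json, text/event-stream',
      'Mcp-Session-Id': sessionId,
      ...credHeaders, // x-supabase-url, x-supabase-key, x-openai-key
    },
    body: JSON.stringify(buildToolCall(name, args, id)),
  });
}
```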
Middle Layer Configuration
Point the middle layer's .env at whichever MCP server you are running.
For the local MCP server:
MCP_SERVER_URL=http://localhost:3000/mcp
For Cloudflare Workers:
MCP_SERVER_URL=https://mcp-supabase.max-smosarski.workers.dev/mcp
# Also set your credentials in .env:
SUPABASE_URL=your-supabase-url
SUPABASE_SERVICE_KEY=your-supabase-key
OPENAI_API_KEY=your-openai-key
The middle layer automatically passes credentials as headers to the Cloudflare Worker.
Available Tools
- upload_document - Upload text or PDF documents
- upload_image - Upload and analyze images
- search_chunks - Semantic search across documents
- get_files - List all documents
- get_document - Retrieve a specific document
- delete_document - Delete a document
- delete_documents - Bulk delete documents
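upload_document splits text into rows of document_chunks before embedding. The real strategy lives in src/tools/upload-document.js; as a rough illustration of the idea only, here is a naive fixed-size chunker (chunkText and the 1000-character size are assumptions, not this repo's actual parameters):

```javascript
// Naive fixed-size chunker: split text into pieces and record chunk_index,
// mirroring the document_chunks schema (content + chunk_index per row).
function chunkText(text, size = 1000) {
  const chunks = [];
  for (let i = 0; i < text.length; i += size) {
    chunks.push({ content: text.slice(i, i + size), chunk_index: chunks.length });
  }
  return chunks;
}
```

Each chunk would then be embedded (vector(1536) in the schema suggests an OpenAI embedding model) and inserted alongside its chunk_index.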
Directory Structure
mcp-server/
├── src/
│ ├── mcp-server.js # Standard MCP implementation
│ ├── mcp-agent.js # Cloudflare Agents SDK implementation
│ ├── stdio-server.js # STDIO transport implementation
│ ├── index.js # REST API wrapper
│ ├── tools/ # Tool implementations
│ │ ├── upload-document.js
│ │ ├── upload-image.js
│ │ ├── search-chunks.js
│ │ ├── get-files.js
│ │ ├── get-document.js
│ │ ├── delete-document.js
│ │ └── delete-documents.js
│ └── services/ # Service implementations
│ ├── supabase.js
│ └── openai.js
├── wrangler.toml # Cloudflare Workers configuration
│ # Includes Durable Objects bindings
├── package.json
├── start-mcp.js # MCP server starter
├── start-stdio.js # STDIO server starter
└── README.md
Environment Variables
For Local Development (HTTP/STDIO servers):
- SUPABASE_URL - Your Supabase project URL
- SUPABASE_SERVICE_KEY - Your Supabase service key
- OPENAI_API_KEY - Your OpenAI API key
- MCP_PORT - Port for the HTTP server (default: 3000)
For Cloudflare Workers:
Credentials are passed via request headers, not environment variables:
- x-supabase-url - Supabase URL
- x-supabase-key - Supabase service key
- x-openai-key - OpenAI API key
This design allows multiple users to use the same deployment with their own credentials.
Testing
# Test Supabase connection
npm run test:supabase
# Full MCP test suite
npm run test:full
# Test with MCP client
npm run test:client
# Database utilities
npm run db:clean # Clean test data
npm run db:debug # Debug database state
Troubleshooting
Cloudflare Workers Issues
- "Invalid binding" error:
  - Ensure Durable Objects are configured in wrangler.toml
  - Use new_sqlite_classes for free tier accounts
  - Check that the binding name matches (MCP_OBJECT)
- "Missing credentials" error:
  - Ensure request headers include all required credentials
  - Check that the middle layer is passing credentials in headers
  - Verify the credential values are correct
- Build errors with duplicate exports:
  - Don't re-export classes already declared with export class
  - Check for multiple exports of the same name
- 405 Method Not Allowed:
  - Normal for GET/DELETE on certain endpoints
  - The MCP protocol uses specific HTTP methods
- Durable Objects on free tier:
  - Must use new_sqlite_classes instead of new_classes
  - Error code 10097 indicates this issue
Local Development Issues
- Port conflicts: Change the port with the MCP_PORT environment variable
- Credential issues: Ensure all environment variables are set correctly
- CORS errors: The server includes appropriate CORS headers
Migration Guide
From Local to Cloudflare Workers
- Update the middle layer's .env:
# Change from:
MCP_SERVER_URL=http://localhost:3000/mcp
# To:
MCP_SERVER_URL=https://mcp-supabase.max-smosarski.workers.dev/mcp
- Ensure the credentials are set in the middle layer's .env:
SUPABASE_URL=your-url
SUPABASE_SERVICE_KEY=your-key
OPENAI_API_KEY=your-key
- Deploy to Cloudflare:
wrangler deploy