# Personal Library MCP Server Demo

## What is this?
This is a functional Model Context Protocol (MCP) server built with the FastMCP framework. It provides a structured interface for an AI model to interact with a local SQLite database that tracks a personal reading list.
## Why use it?

This project is a deliberately simple demo for seeing the Model Context Protocol (MCP) in action. It serves as a minimal, "Hello World"-style example to help you nail the basics of:
- Resources: Exposing data (like a list of books) as readable URI-based resources.
- Tools: Providing actionable functions (like adding or searching books) that an AI can call.
- Client-Server Communication: Demonstrating how a client and server interact using the standard stdio transport.
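Under the hood, the stdio transport exchanges JSON-RPC 2.0 messages over the server process's stdin and stdout. The sketch below illustrates the shape of one tool-call round trip; the `add_book` arguments and the reply text are made up for illustration, while the `tools/call` method and result shape follow the MCP specification:

```python
import json

# The client writes a JSON-RPC request to the server's stdin,
# asking it to invoke a tool by name with arguments.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add_book",
        "arguments": {"title": "Dune", "author": "Frank Herbert"},
    },
}

# The server writes a response with the matching id to its stdout.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Added 'Dune' to the library."}]},
}

# Each message travels as a single serialized JSON line.
wire_request = json.dumps(request)
wire_response = json.dumps(response)
print(wire_request)
print(wire_response)
```

The framework handles this framing for you; the point is only that "stdio transport" means plain JSON messages piped between two local processes, with no network involved.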
## Getting Started

### Prerequisites
- `uv` installed on your system.
- Python 3.10 or later.
### Setup and Running the Demo

1. Initialize the Database: Create the SQLite database and populate it with sample data:

   ```shell
   uv run python init_db.py
   ```

2. Run the Smoke Test: This script acts as a smoke test for your MCP server. It starts the server in the background and simulates how an AI model would interact with it (reading resources, calling tools) without needing an actual AI model connected:

   ```shell
   uv run python main.py
   ```
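The contents of `init_db.py` aren't reproduced here, but a setup script of this kind typically boils down to a few `sqlite3` calls. A minimal sketch, assuming a `books` table with title, author, genre, year, and status columns (the real schema and sample data may differ):

```python
import sqlite3

# Create the database file, define a books table, and seed sample rows.
# Schema and sample data are assumptions, not the real init_db.py.
conn = sqlite3.connect("books.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS books (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        title TEXT NOT NULL,
        author TEXT NOT NULL,
        genre TEXT,
        year INTEGER,
        status TEXT DEFAULT 'unread'
    )"""
)
conn.executemany(
    "INSERT INTO books (title, author, genre, year) VALUES (?, ?, ?, ?)",
    [
        ("Dune", "Frank Herbert", "Sci-Fi", 1965),
        ("The Martian", "Andy Weir", "Sci-Fi", 2011),
    ],
)
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM books").fetchone()[0]
conn.close()
print(count)  # number of rows now in the table
```

Parameterized `?` placeholders are used rather than string formatting, which is the idiomatic way to pass values to SQLite from Python.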
## Project Structure

- `server.py`: The MCP server implementation using FastMCP.
- `main.py`: A smoke test script that demonstrates how to interact with the server.
- `init_db.py`: A setup script to create the local `books.db` SQLite database.
- `pyproject.toml`: Project configuration and dependencies (managed by `uv`).
- `books.db`: The local SQLite database (created after running `init_db.py`).
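For a feel of what a tool like book search reduces to inside the server, here is a hedged sketch: the real `server.py` query and schema may differ, and `search_books` is a hypothetical stand-in, but a substring match over title and author is a common shape for this kind of tool:

```python
import sqlite3

def search_books(conn: sqlite3.Connection, query: str) -> list[tuple]:
    """Case-insensitive substring match on title or author."""
    pattern = f"%{query}%"
    return conn.execute(
        "SELECT title, author FROM books WHERE title LIKE ? OR author LIKE ?",
        (pattern, pattern),
    ).fetchall()

# Demo against an in-memory database so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, author TEXT)")
conn.executemany(
    "INSERT INTO books VALUES (?, ?)",
    [("Dune", "Frank Herbert"), ("The Martian", "Andy Weir")],
)
print(search_books(conn, "Herbert"))  # → [('Dune', 'Frank Herbert')]
```

SQLite's `LIKE` is case-insensitive for ASCII by default, so a query like "martian" still finds "The Martian".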
## Using as a Tool with AI Assistants

You can connect this server to any MCP-compatible client. The examples below use `/Users/sanka/Documents/workspace/mcp-demo` as the project path; replace it with the absolute path to your own `mcp-demo` directory.
### 1. Gemini CLI
You can add the server automatically using the Gemini CLI:
```shell
gemini mcp add --scope project personal-library uv --directory $(pwd) run python server.py
```
Or manually add this to `.gemini/settings.json`:
```json
{
  "mcpServers": {
    "personal-library": {
      "command": "uv",
      "args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"],
      "trust": true
    }
  }
}
```
### 2. Claude Desktop

Add this to your `claude_desktop_config.json` (typically in `~/Library/Application Support/Claude/` on macOS):
```json
{
  "mcpServers": {
    "personal-library": {
      "command": "uv",
      "args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"]
    }
  }
}
```
### 3. Cline (VS Code Extension)

Open the MCP Settings in Cline or edit `cline_mcp_settings.json`:
```json
{
  "mcpServers": {
    "personal-library": {
      "command": "uv",
      "args": ["--directory", "/Users/sanka/Documents/workspace/mcp-demo", "run", "python", "server.py"]
    }
  }
}
```
## Sample Prompts for AI Agents
Once you've connected the server to your favorite AI assistant, try these prompts:
- List Resources: "What books are currently in my reading list?"
- Search: "Find 'The Martian' in my library." or "Do I have any books by Frank Herbert?"
- Add a Book: "Add 'Project Hail Mary' by Andy Weir to my library. It's a Sci-Fi book from 2021."
- Update Status: "I just finished reading 'Dune', can you mark it as read?" or "I just bought 'The Road', mark it as owned."
- Check Details: "Show me the full metadata for 'The Lord of the Rings'."
- Combined Task: "Look at my library and tell me which Sci-Fi books I haven't read yet."
## Naming Conventions

- Server Name (Configuration): The key used in `settings.json` (e.g., `"personal-library"`) is a unique identifier for your AI client to manage multiple servers.
- Display Name (Code): The name passed to `FastMCP("Personal Library Manager")` in `server.py` is what appears in the UI of apps like Claude Desktop.
- Tool/Resource Names: These (e.g., `add_book`, `library://...`) must match exactly between `server.py` and `main.py`.
You don't need to use these names in your prompts! The AI assistant automatically discovers all available tools and resources once the server is connected.