Gemini Researcher
A lightweight, stateless MCP (Model Context Protocol) server that lets developer agents (Claude Code, GitHub Copilot) delegate deep repository analysis to the Gemini CLI. The server is read-only, returns structured JSON (as text content), and is optimized to reduce the calling agent's context and model usage.
Status: v1 complete. Core features are stable, but still early days. Feedback welcome!
If this project extended the lifespan of your usage window, ⭐ please consider giving it a star! :)
Primary goals:
- Reduce agent context usage by letting Gemini CLI read large codebases locally and do its own research
- Reduce calling-agent model usage by offloading heavy analysis to Gemini
- Keep the server stateless and read-only for safety
Why use this?
Instead of copying entire files into your agent's context (burning tokens and cluttering the conversation), this server lets Gemini CLI read files directly from your project. Your agent sends a research query, Gemini does the heavy lifting with its large context window, and returns structured results. You save tokens, your agent stays focused, and complex codebase analysis becomes practical.
Verified clients: Claude Code, Cursor, VS Code (GitHub Copilot)
[!NOTE] It should work with other clients as well, but I haven't personally tested them yet. Please open an issue if you try it elsewhere!
Overview
Gemini Researcher accepts research-style queries over the MCP protocol and spawns the Gemini CLI in headless, read-only mode to perform large-context analysis on local files referenced with @path. Results are returned as pretty-printed JSON strings suitable for programmatic consumption by agent clients.
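For illustration, a tool result might look like the following. The field names here are hypothetical, chosen only to show the general shape; they are not the server's actual schema:

```json
{
  "status": "ok",
  "summary": "auth.ts implements a session-cookie login flow...",
  "citations": ["src/auth.ts", "src/middleware/session.ts"],
  "chunked": false
}
```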
Prerequisites
- Node.js 18+ installed
- Gemini CLI installed: `npm install -g @google/gemini-cli`
- Gemini CLI authenticated (recommended: run `gemini`, then Login with Google) or set `GEMINI_API_KEY`
Quick checks:
node --version
gemini --version
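The two checks above can be wrapped in a small helper if you prefer a single pass/fail report (`check` is just an illustrative function name, not part of this project):

```shell
# Report whether each prerequisite is on PATH.
check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: ok"
  else
    echo "$1: missing"
  fi
}
check node
check gemini
```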
Quickstart
Step 1: Validate environment
Run the setup wizard to verify Gemini CLI is installed and authenticated:
npx gemini-researcher init
Step 2: Configure your MCP client
The standard config below works in most tools:
{
"mcpServers": {
"gemini-researcher": {
"command": "npx",
"args": [
"gemini-researcher"
]
}
}
}
VS Code
Add to your VS Code MCP settings (create .vscode/mcp.json if needed):
{
"servers": {
"gemini-researcher": {
"command": "npx",
"args": [
"gemini-researcher"
]
}
}
}
Claude Code
Option 1: Command line (recommended)
Local (user-wide) scope
# Add the MCP server via CLI
claude mcp add --transport stdio gemini-researcher -- npx gemini-researcher
# Verify it was added
claude mcp list
Project scope
Navigate to your project directory, then run:
# Add the MCP server via CLI
claude mcp add --scope project --transport stdio gemini-researcher -- npx gemini-researcher
# Verify it was added
claude mcp list
Option 2: Manual configuration
Add to .mcp.json in your project root (project scope):
{
"mcpServers": {
"gemini-researcher": {
"command": "npx",
"args": [
"gemini-researcher"
]
}
}
}
Or add to ~/.claude/settings.json for local scope.
After adding the server, restart Claude Code and use /mcp to verify the connection.
Cursor
Go to Cursor Settings -> Tools & MCP -> Add a Custom MCP Server. Add the following configuration:
{
"mcpServers": {
"gemini-researcher": {
"type": "stdio",
"command": "npx",
"args": [
"gemini-researcher"
]
}
}
}
[!NOTE] The server uses the directory where your IDE opened the workspace (or your terminal's current directory) as the project root. To analyze a different directory, set
PROJECT_ROOT:
Example
{
"mcpServers": {
"gemini-researcher": {
"command": "npx",
"args": [
"gemini-researcher"
],
"env": {
"PROJECT_ROOT": "/path/to/your/project"
}
}
}
}
Step 3: Restart your MCP client
Step 4: Test it
Ask your agent: "Use gemini-researcher to analyze the project."
Tools
All tools return structured JSON (as MCP text content). Large responses are automatically chunked (~10KB per chunk) and cached for 1 hour.
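The chunking behavior can be pictured with a short sketch. The ~10KB figure comes from the text above; the split logic and names below are assumptions for illustration, not the server's code:

```python
CHUNK_SIZE = 10 * 1024  # ~10KB per chunk, as described above

def split_into_chunks(payload: str, size: int = CHUNK_SIZE) -> list[str]:
    # Naive fixed-size split; the real server may split on nicer boundaries.
    return [payload[i:i + size] for i in range(0, len(payload), size)]

big = "x" * 25_000
chunks = split_into_chunks(big)
print(len(chunks))            # 3 chunks for a 25,000-char payload
print(sum(map(len, chunks)))  # 25000: nothing lost
```

A client that receives a chunked response then retrieves the remaining parts with `fetch_chunk`.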
| Tool | Purpose | When to use |
|---|---|---|
| quick_query | Fast analysis with flash model | Quick questions about specific files or small code sections |
| deep_research | In-depth analysis with pro model | Complex multi-file analysis, architecture reviews, security audits |
| analyze_directory | Map directory structure | Understanding unfamiliar codebases, generating project overviews |
| validate_paths | Pre-check file paths | Verify files exist before running expensive queries |
| health_check | Diagnostics | Troubleshooting server/Gemini CLI issues |
| fetch_chunk | Get chunked responses | Retrieve remaining parts of large responses |
Example workflows
Understanding a security vulnerability:
Agent: Use deep_research to analyze authentication flow across @src/auth and @src/middleware, focusing on security
Quick code explanation:
Agent: Use quick_query to explain the login flow in @src/auth.ts, be concise
Mapping an unfamiliar codebase:
Agent: Use analyze_directory on src/ with depth 3 to understand the project structure
Full tool schemas (for reference)
quick_query
{
"prompt": "Explain @src/auth.ts login flow",
"focus": "security",
"responseStyle": "concise"
}
deep_research
{
"prompt": "Analyze authentication across @src/auth and @src/middleware",
"focus": "architecture",
"citationMode": "paths_only"
}
analyze_directory
{
"path": "src",
"depth": 3,
"maxFiles": 200
}
validate_paths
{
"paths": ["src/auth.ts", "README.md"]
}
health_check
{
"includeDiagnostics": true
}
fetch_chunk
{
"cacheKey": "cache_abc123",
"chunkIndex": 2
}
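A client would typically call fetch_chunk in a loop until every part has been retrieved. A minimal sketch, with a stubbed `call_tool` standing in for a real MCP client and assumed response fields (`data`, `totalChunks`):

```python
# Stub for an MCP tool call; a real client would send fetch_chunk over
# stdio. The cache contents and field names here are illustrative only.
_FAKE_CACHE = {"cache_abc123": ["part-0|", "part-1|", "part-2"]}

def call_tool(name: str, args: dict) -> dict:
    chunks = _FAKE_CACHE[args["cacheKey"]]
    return {"data": chunks[args["chunkIndex"]], "totalChunks": len(chunks)}

def fetch_all(cache_key: str) -> str:
    parts, index, total = [], 0, 1
    while index < total:
        result = call_tool("fetch_chunk", {"cacheKey": cache_key, "chunkIndex": index})
        parts.append(result["data"])
        total = result["totalChunks"]
        index += 1
    return "".join(parts)

print(fetch_all("cache_abc123"))  # part-0|part-1|part-2
```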
Docker
A pre-built multi-platform Docker image is available on Docker Hub:
# Pull the image (works on Intel/AMD and Apple Silicon)
docker pull capybearista/gemini-researcher:latest
# Run the server (mount your project and provide API key)
docker run -i --rm \
-e GEMINI_API_KEY="your-api-key" \
-v /path/to/your/project:/workspace \
capybearista/gemini-researcher:latest
For MCP client configuration with Docker:
{
"mcpServers": {
"gemini-researcher": {
"command": "docker",
"args": [
"run", "-i", "--rm",
"-e", "GEMINI_API_KEY",
"-v", "/path/to/your/project:/workspace",
"capybearista/gemini-researcher:latest"
],
"env": {
"GEMINI_API_KEY": "your-api-key-here"
}
}
}
}
[!NOTE]
- The `-i` flag is required for stdio transport
- The container mounts your project to `/workspace` (the project root)
- Replace `/path/to/your/project` with your actual project path
- Replace `your-api-key` with your actual Gemini API key (required for Docker usage)
Troubleshooting (common issues)
- `GEMINI_CLI_NOT_FOUND`: Install Gemini CLI: `npm install -g @google/gemini-cli`
- `AUTH_MISSING`: Run `gemini` and authenticate, or set `GEMINI_API_KEY`
- `.gitignore` blocking files: Gemini respects `.gitignore` by default; toggle `fileFiltering.respectGitIgnore` in `gemini /settings` if you intentionally want ignored files included (note: this changes Gemini behavior globally)
- `PATH_NOT_ALLOWED`: All `@path` references must resolve inside the configured project root (`process.cwd()` by default). Use `validate_paths` to pre-check paths.
- `QUOTA_EXCEEDED`: The server retries with fallback models; if all tiers are exhausted, reduce scope (use `quick_query`) or wait for the quota to reset.
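If you hit PATH_NOT_ALLOWED often, the containment rule can be mimicked locally before sending a query. This is a sketch of that check under the stated rule (resolve the path and require it to stay inside the project root); the server's actual implementation may differ:

```python
from pathlib import Path

def is_allowed(path: str, project_root: str) -> bool:
    # A reference is valid only if it resolves inside the project root
    # ("." segments, ".." segments, and symlinks are normalized away).
    root = Path(project_root).resolve()
    target = (root / path).resolve()
    return target == root or root in target.parents

print(is_allowed("src/auth.ts", "/tmp/proj"))    # True
print(is_allowed("../etc/passwd", "/tmp/proj"))  # False: escapes the root
```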
Contributing
We welcome contributions! Please read the Contributing Guide to get started.
License
Made with ♡ for the AI-assisted dev community