
Context-Engine

A self-improving code search and context compression suite that provides hybrid semantic/lexical search, precise micro-chunking, and resident memory for AI agents.

Stars: 276 · Forks: 33 · Tools: 11 · Updated: Jan 23, 2026 · Validated: Jan 24, 2026


Documentation: Getting Started · README · Configuration · IDE Clients · MCP API · ctx CLI · Memory Guide · Architecture · Multi-Repo · Observability · Kubernetes · VS Code Extension · Troubleshooting · Development


Context-Engine

Open-core, self-improving code search that gets smarter every time you use it.



Quick Start: Stack in 30 Seconds

VS Code Extension (Easiest)

  1. Install the Context Engine Uploader extension
  2. Open any project → the extension prompts you to set up the Context-Engine stack
  3. The open workspace is indexed automatically
  4. Generated MCP configs connect your agent/IDE

That's it! The extension handles everything:

  • Clones Context-Engine to your chosen location (keeps it separate from your project)
  • Starts the Docker stack automatically
  • Sets up MCP bridge configuration
  • Writes MCP configs for Claude Code, Windsurf, and Augment

Claude Code users: Install the skill plugin:

/plugin marketplace add m1rl0k/Context-Engine
/plugin install context-engine

Manual Setup (Alternative)

git clone https://github.com/m1rl0k/Context-Engine.git && cd Context-Engine
make bootstrap  # One-shot: up → wait → index → warm → health

Or step-by-step:

docker compose up -d
HOST_INDEX_PATH=/path/to/your/project docker compose run --rm indexer

See Configuration for environment variables and IDE_CLIENTS.md for MCP setup.


Why This Stack Works Better

| Problem | Context-Engine Solution |
| --- | --- |
| Large file chunks → returns entire files | **Precise spans**: returns 5-50 line chunks, not whole files |
| Lost context → missing relevant code | **Hybrid search**: semantic + lexical + cross-encoder reranking |
| Cloud dependency → vendor lock-in | **Local stack**: Docker Compose on your machine |
| Static knowledge → never improves | **Adaptive learning**: gets smarter with every use |
| Tool limits → only works in specific IDEs | **MCP native**: works with any MCP-compatible tool |

What You Get Out of the Box

  • ReFRAG-inspired micro-chunking: Research-grade precision retrieval
  • Self-hosted stack: No cloud dependency, no vendor lock-in
  • Universal compatibility: Claude Code, Windsurf, Cursor, Cline, etc.
  • Auto-syncing: Extension watches for changes and re-indexes automatically
  • Memory system: Store team knowledge alongside your code
  • Optional LLM features: Local decoder (llama.cpp), cloud integration (GLM, MiniMax), adaptive rerank learning

Works With Your Local Files

No complicated path setup: Context-Engine automatically maps between your local file paths and the search index.

Enterprise-Ready Features

  • Built-in authentication with session management (optional)
  • Unified MCP endpoint that combines indexer and memory services
  • Automatic collection injection for workspace-aware queries

Alternative: Direct HTTP endpoints

{
  "mcpServers": {
    "qdrant-indexer": { "url": "http://localhost:8003/mcp" },
    "memory": { "url": "http://localhost:8002/mcp" }
  }
}
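As a minimal sketch, you can drop that snippet into a client's MCP config file by hand. The destination path below is a placeholder (each client keeps its config in a different location; see docs/IDE_CLIENTS.md for the real paths):

```shell
# Write the direct-HTTP MCP config from the snippet above to a file.
# ./mcp-demo/mcp_config.json is a placeholder path, not a real client location.
mkdir -p ./mcp-demo
cat > ./mcp-demo/mcp_config.json <<'EOF'
{
  "mcpServers": {
    "qdrant-indexer": { "url": "http://localhost:8003/mcp" },
    "memory": { "url": "http://localhost:8002/mcp" }
  }
}
EOF
# Sanity-check that the file is valid JSON before pointing a client at it.
python3 -c "import json; json.load(open('./mcp-demo/mcp_config.json'))" && echo "config OK"
```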

Using other IDEs? See docs/IDE_CLIENTS.md for complete MCP configuration examples.


Supported Clients

| Client | Transport |
| --- | --- |
| Claude Code | SSE / RMCP |
| Cursor | SSE / RMCP |
| Windsurf | SSE / RMCP |
| Cline | SSE / RMCP |
| Roo | SSE / RMCP |
| OpenCode | RMCP |
| Augment | SSE |
| Codex | RMCP |
| Copilot | RMCP |
| AmpCode | RMCP |
| Kiro | RMCP |
| Antigravity | RMCP |
| Zed | SSE (via mcp-remote) |
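For Zed, the SSE connection goes through mcp-remote. A hedged sketch of the settings fragment, assuming Zed's `context_servers` schema (field names have changed across Zed versions; see docs/IDE_CLIENTS.md for a current example):

```json
{
  "context_servers": {
    "context-engine": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:8001/sse"]
    }
  }
}
```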

Endpoints

| Service | URL |
| --- | --- |
| Indexer MCP (SSE) | http://localhost:8001/sse |
| Indexer MCP (RMCP) | http://localhost:8003/mcp |
| Memory MCP (SSE) | http://localhost:8000/sse |
| Memory MCP (RMCP) | http://localhost:8002/mcp |
| Qdrant | http://localhost:6333 |
| Upload Service | http://localhost:8004 |

VS Code Extension

Context Engine Uploader provides:

  • One-click upload — Sync workspace to Context-Engine
  • Auto-sync — Watch for changes and re-index automatically
  • Prompt+ button — Enhance prompts with code context before sending
  • MCP auto-config — Writes Claude/Windsurf MCP configs

See docs/vscode-extension.md for full documentation.


MCP Tools

Search (Indexer MCP):

  • repo_search — Hybrid code search with filters
  • context_search — Blend code + memory results
  • context_answer — LLM-generated answers with citations
  • search_tests_for, search_config_for, search_callers_for

Memory (Memory MCP):

  • store — Save knowledge with metadata
  • find — Retrieve stored memories

Indexing:

  • qdrant_index_root — Index the workspace
  • qdrant_status — Check collection health
  • qdrant_prune — Remove stale entries

See docs/MCP_API.md for complete API reference.
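As a concrete sketch, a repo_search invocation over the RMCP endpoint is a standard MCP `tools/call` JSON-RPC request. The argument names below (`query`, `limit`) are assumptions for illustration; docs/MCP_API.md has the actual schema:

```shell
# Build a tools/call request for repo_search.
# Argument names ("query", "limit") are assumed, not confirmed by the API docs.
cat > ./repo_search_req.json <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "repo_search",
    "arguments": { "query": "jwt token validation", "limit": 5 }
  }
}
EOF
# With the stack running, you would POST it to the RMCP endpoint, e.g.:
#   curl -s -X POST http://localhost:8003/mcp \
#     -H 'Content-Type: application/json' -d @./repo_search_req.json
python3 -c "import json; json.load(open('./repo_search_req.json'))" && echo "request OK"
```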


Documentation

| Guide | Description |
| --- | --- |
| Getting Started | VS Code + dev-remote walkthrough |
| IDE Clients | Config examples for all supported clients |
| Configuration | Environment variables reference |
| MCP API | Full tool documentation |
| Architecture | System design |
| Multi-Repo | Multiple repositories in one collection |
| Kubernetes | Production deployment |

How It Works

flowchart LR
  subgraph Your Machine
    A[IDE / AI Tool]
    V[VS Code Extension]
  end
  subgraph Docker
    U[Upload Service]
    I[Indexer MCP]
    M[Memory MCP]
    Q[(Qdrant)]
    L[[LLM Decoder]]
    W[[Learning Worker]]
  end
  V -->|sync| U
  U --> I
  A -->|MCP| I
  A -->|MCP| M
  I --> Q
  M --> Q
  I -.-> L
  I -.-> W
  W -.-> Q

Language Support

Python, TypeScript/JavaScript, Go, Java, Rust, C#, PHP, Shell, Terraform, YAML, PowerShell


Benchmarks

CoSQA (Dense Retrieval, No Rerank)

| Method | MRR | R@1 | R@5 | R@10 | NDCG@10 |
| --- | --- | --- | --- | --- | --- |
| Context-Engine (Jina-Code) | 0.276 | 0.146 | 0.448 | 0.658 | 0.365 |
| Context-Engine (BGE-base) | 0.253 | 0.150 | 0.374 | 0.550 | 0.322 |
| CodeT5+ embedding | 0.266 | - | - | - | - |
| BM25 (Lucene) | 0.167 | - | - | - | - |
| BoW | 0.065 | - | - | - | - |

Corpus: 20,604 code snippets · 500 queries · pure dense retrieval, no reranking. Jina-Code: jinaai/jina-embeddings-v2-base-code (code-specific, 8k context)
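For reference, the headline metrics in the table are the standard retrieval definitions: MRR averages the reciprocal rank of the first relevant snippet over the query set $Q$, and R@$k$ is the fraction of queries whose relevant snippet appears in the top $k$ results:

$$\mathrm{MRR} = \frac{1}{|Q|} \sum_{i=1}^{|Q|} \frac{1}{\mathrm{rank}_i}, \qquad \mathrm{R@}k = \frac{1}{|Q|} \sum_{i=1}^{|Q|} \mathbf{1}\!\left[\mathrm{rank}_i \le k\right]$$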

CoIR Benchmark (Full Corpus, Dense Retrieval)

| Benchmark | Corpus | Queries | NDCG@10 |
| --- | --- | --- | --- |
| CodeSearchNet-Python | 280K | 14.9K | 74.37% |
| CodeSearchNet-Go | 280K | 14.9K | 74.51% |
| CodeSearchNet-JavaScript | 280K | 14.9K | 57.19% |

Full CoIR corpus evaluation with dense retrieval (Jina-Code embeddings)


License

BUSL-1.1
