
Memory MCP Server

SQLite-backed MCP server for persistent memory, full-text retrieval, and graph traversal.

Registry · Updated: Feb 25, 2026

Quick Install

npx -y @j0hanz/memory-mcp

Memory MCP



A SQLite-backed MCP server for persistent memory storage, full-text retrieval, and relationship graph traversal.

Overview

Memory MCP provides a local, persistent memory layer for MCP-enabled assistants. It stores SHA-256-addressed memory items in SQLite with FTS5-powered full-text search, a directed relationship graph, BFS recall traversal, and token-budget-aware context retrieval — all accessible over stdio transport with no external dependencies.

Key Features

  • 13 MCP tools for CRUD, batch operations, FTS5 search, BFS graph recall, token-budget context retrieval, relationships, and stats.
  • Full-text search over content and tags via SQLite FTS5 with importance and type filters.
  • Graph recall with BFS traversal, bounded frontier, and MCP progress notifications per hop.
  • Token-budget retrieval (retrieve_context) selects memories that fit a caller-specified token budget — no manual pagination needed.
  • Strict Zod input validation with typed output envelopes and SHA-256 hash addressing.
  • Resource support with internal://instructions (Markdown guide) and memory://memories/{hash} URI template with hash auto-completion.
  • stdio transport with clean shutdown handling (SIGINT, SIGTERM) and no HTTP endpoints.

Requirements

  • Node.js >=24.
  • SQLite with FTS5 support (verified at startup).
  • Any MCP client that supports stdio command servers.

Quick Start

Use the npm package directly with npx — no installation required:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

[!TIP] The server uses stdio transport only; no HTTP endpoint is exposed. Stdout must not be polluted by custom logging.

Or run with Docker:

```shell
docker run --rm -i ghcr.io/j0hanz/memory-mcp:latest
```

Client Configuration

Install in VS Code


Workspace file .vscode/mcp.json:

```json
{
  "servers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

CLI:

```shell
code --add-mcp '{"name":"memory-mcp","command":"npx","args":["-y","@j0hanz/memory-mcp@latest"]}'
```

Install in VS Code Insiders


CLI:

```shell
code-insiders --add-mcp '{"name":"memory-mcp","command":"npx","args":["-y","@j0hanz/memory-mcp@latest"]}'
```

Install in Cursor


~/.cursor/mcp.json:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

Install in Claude Desktop / Claude Code

claude_desktop_config.json:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

CLI:

```shell
claude mcp add memory-mcp -- npx -y @j0hanz/memory-mcp@latest
```

Install in Windsurf

MCP config:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

Run with Docker

```shell
# Pull and run (stdio mode)
docker run --rm -i \
  -e MEMORY_DB_PATH=/data/memory.db \
  -v memory-data:/data \
  ghcr.io/j0hanz/memory-mcp:latest
```

MCP client config:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "MEMORY_DB_PATH=/data/memory.db",
        "-v",
        "memory-data:/data",
        "ghcr.io/j0hanz/memory-mcp:latest"
      ]
    }
  }
}
```

Documentation Maintenance

  • Owner: maintainers updating MCP behavior in src/ must update README.md and affected mcp/ mirror pages in the same PR.
  • Link/version policy: use pinned https://modelcontextprotocol.io/specification/2025-11-25/... links for protocol references; avoid latest and mixed legacy targets.
  • Drift-check checklist:
    • Re-verify capability declarations in src/server.ts.
    • Reconcile tool/resource/prompt docs with src/tools/index.ts, src/resources/index.ts, and src/prompts/index.ts.
    • Confirm limitations/gotchas in src/instructions.md match runtime behavior.
  • Validation commands: npm run type-check, npm run test:fast, npm run build.

MCP Surface

Tools Summary

| Tool | Category | Notes |
| --- | --- | --- |
| `store_memory` | Write | Idempotent by content+sorted tags hash |
| `store_memories` | Write | Batch (1–50), transaction-wrapped |
| `get_memory` | Read | Hash lookup |
| `update_memory` | Write | Returns old_hash + new_hash |
| `delete_memory` | Write | Cascades relationship deletion |
| `delete_memories` | Write | Batch (1–50), transaction-wrapped |
| `search_memories` | Read | FTS5 + importance/type filters + cursor |
| `create_relationship` | Write | Idempotent directed edge creation |
| `delete_relationship` | Write | Deletes exact directed edge |
| `get_relationships` | Read | Direction filter + linked memory fields |
| `recall` | Read | FTS5 seed + BFS traversal (depth 0–3) |
| `retrieve_context` | Read | Token-budget-aware context retrieval |
| `memory_stats` | Read | Store aggregates and type breakdown |

store_memory

Store a new memory with content, tags, and optional type/importance. Idempotent — storing the same content+tags returns the existing hash with created: false.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `content` | string | Yes |  | Memory content (1–100000 chars) |
| `tags` | string[] | Yes |  | 1–100 tags, each max 50 chars, no whitespace |
| `memory_type` | enum | No | `general` | `general`, `fact`, `plan`, `decision`, `reflection`, `lesson`, `error`, `gradient` |
| `importance` | integer | No | `0` | Priority 0–10 |

Returns: { hash, created }
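
Since items are addressed by a SHA-256 of content plus sorted tags, idempotency can be illustrated with a small sketch. The exact canonical serialization the server uses is an implementation detail; joining content with sorted tags is an assumption here.

```typescript
import { createHash } from "node:crypto";

// Illustrative only: the server hashes content + sorted tags, but its
// precise canonical format is internal. "\n" + comma-joined tags is a guess.
function memoryHash(content: string, tags: string[]): string {
  const canonical = content + "\n" + [...tags].sort().join(",");
  return createHash("sha256").update(canonical, "utf8").digest("hex");
}

// Tag order does not matter, so re-storing the same memory with
// reordered tags yields the same address (and created: false).
const a = memoryHash("Use FTS5 for search", ["sqlite", "search"]);
const b = memoryHash("Use FTS5 for search", ["search", "sqlite"]);
console.log(a === b, a.length); // true 64
```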


store_memories

Store multiple memories in one transaction (max 50 items).

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `items` | `Array<StoreMemoryItem>` | Yes | 1–50 items, each with `content`, `tags`, optional `memory_type`, optional `importance` |

Returns: { items, succeeded, failed }


get_memory

Retrieve one memory by its SHA-256 hash.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `hash` | string | Yes | 64-char lowercase SHA-256 hex |

Returns: Memory or { ok: false, error } on E_NOT_FOUND.


update_memory

Update content and optionally tags for an existing memory. Returns both hashes.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `hash` | string | Yes |  | Existing memory hash |
| `content` | string | Yes |  | Replacement content |
| `tags` | string[] | No | Existing tags | Replacement tags |

Returns: { old_hash, new_hash }


delete_memory

Delete one memory by hash. Cascades to related relationship rows.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `hash` | string | Yes | Memory hash |

Returns: { hash, deleted }


delete_memories

Delete multiple memories by hash in one transaction.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `hashes` | string[] | Yes | 1–50 memory hashes |

Returns: { items, succeeded, failed }


search_memories

Full-text search over memory content and tags using FTS5. Supports importance and type filters with cursor pagination.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `query` | string | Yes |  | Search text (1–1000 chars) |
| `limit` | integer | No | `20` | Results per page (1–100) |
| `cursor` | string | No |  | Pagination cursor from previous response |
| `min_importance` | integer | No |  | Only return memories with importance >= this value (0–10) |
| `max_importance` | integer | No |  | Only return memories with importance <= this value (0–10) |
| `memory_type` | enum | No |  | Filter by memory type |

Returns: { memories, total_returned, nextCursor? }


create_relationship

Create a directed relationship edge between two memories. Idempotent.

Suggested relation_type values: related_to, causes, depends_on, parent_of, child_of, supersedes, contradicts, supports, references.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `from_hash` | string | Yes | Source memory hash |
| `to_hash` | string | Yes | Target memory hash |
| `relation_type` | string | Yes | Edge label (1–50 chars, no whitespace, free-form) |

Returns: { created }


delete_relationship

Delete one directed relationship edge.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `from_hash` | string | Yes | Source hash |
| `to_hash` | string | Yes | Target hash |
| `relation_type` | string | Yes | Relationship type |

Returns: { deleted } or { ok: false, error } on E_NOT_FOUND.


get_relationships

Retrieve relationships for a memory, with optional direction filter.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `hash` | string | Yes |  | Memory hash |
| `direction` | enum | No | `both` | `outgoing`, `incoming`, or `both` |

Returns: { relationships, count }

Each relationship includes from_hash, to_hash, relation_type, created_at, linked_hash, linked_content, and linked_tags.


recall

Search memories by full-text query, then traverse the relationship graph up to depth hops via BFS. Emits MCP progress notifications per hop.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `query` | string | Yes |  | Seed search query (1–1000 chars) |
| `depth` | integer | No | `1` | BFS hops (0–3) |
| `limit` | integer | No | `10` | Seed memory count (1–50) |
| `cursor` | string | No |  | Pagination cursor from previous response |
| `min_importance` | integer | No |  | Seed filter: only memories with importance >= value (0–10) |
| `max_importance` | integer | No |  | Seed filter: only memories with importance <= value (0–10) |
| `memory_type` | enum | No |  | Seed filter: only memories of this type |

Returns: { memories, graph, depth_reached, aborted?, nextCursor? }

Each item in graph uses the shape:

```json
{ "from_hash": "...", "to_hash": "...", "relation_type": "..." }
```

[!NOTE] aborted: true indicates the traversal hit a safety limit (RECALL_MAX_FRONTIER_SIZE, RECALL_MAX_EDGE_ROWS, or RECALL_MAX_VISITED_NODES). Partial results are still returned.
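
The bounded traversal behind recall can be sketched as a plain BFS over the edge list, with the frontier cap standing in for `RECALL_MAX_FRONTIER_SIZE`. This is a hypothetical reconstruction, not the server's code; it follows outgoing edges only, while the real traversal may also walk incoming ones.

```typescript
// Hypothetical sketch: expand seed hashes hop by hop over the relationship
// graph, stopping early (aborted: true) when a safety limit is exceeded.
type Edge = { from_hash: string; to_hash: string; relation_type: string };

function bfsRecall(
  seeds: string[],
  edges: Edge[],
  depth: number,
  maxFrontier = 1000, // stands in for RECALL_MAX_FRONTIER_SIZE
): { visited: string[]; aborted: boolean } {
  const visited = new Set(seeds);
  let frontier = [...seeds];
  let aborted = false;
  for (let hop = 0; hop < depth && frontier.length > 0; hop++) {
    const next: string[] = [];
    for (const edge of edges) {
      // Follow outgoing edges whose source is in the current frontier.
      if (frontier.includes(edge.from_hash) && !visited.has(edge.to_hash)) {
        visited.add(edge.to_hash);
        next.push(edge.to_hash);
      }
    }
    if (next.length > maxFrontier) {
      aborted = true; // safety limit hit; partial results still returned
      break;
    }
    frontier = next;
  }
  return { visited: [...visited], aborted };
}

const edges: Edge[] = [
  { from_hash: "A", to_hash: "B", relation_type: "causes" },
  { from_hash: "B", to_hash: "C", relation_type: "causes" },
];
console.log(bfsRecall(["A"], edges, 2).visited); // [ 'A', 'B', 'C' ]
```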


retrieve_context

Search memories and return relevance-ranked results that fit within a caller-specified token budget. Eliminates manual pagination and token counting for context window management.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `query` | string | Yes |  | Search query (1–1000 chars) |
| `token_budget` | integer | No | `4000` | Maximum estimated tokens to return (100–200000) |
| `strategy` | enum | No | `relevance` | Sort order: `relevance` (FTS rank), `importance` (highest first), `recency` (newest first) |

Returns: { memories, estimated_tokens, truncated }

[!TIP] Token estimation is approximate (content length ÷ 4). truncated: true means the budget was reached before all candidates were included.
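
The budget-aware selection that retrieve_context implies can be sketched as a greedy pass over ranked candidates, using the README's stated length ÷ 4 estimate. The server's actual selection logic may differ; this is an illustration of the contract, not its implementation.

```typescript
// Walk candidates in ranked order, charging ~length/4 tokens each,
// and stop once the caller's budget would be exceeded.
type Candidate = { content: string; importance: number };

function fitToBudget(
  ranked: Candidate[],
  tokenBudget: number,
): { selected: Candidate[]; estimated_tokens: number; truncated: boolean } {
  const selected: Candidate[] = [];
  let used = 0;
  for (const memory of ranked) {
    const cost = Math.ceil(memory.content.length / 4); // rough token estimate
    if (used + cost > tokenBudget) {
      // Budget reached before all candidates were included.
      return { selected, estimated_tokens: used, truncated: true };
    }
    selected.push(memory);
    used += cost;
  }
  return { selected, estimated_tokens: used, truncated: false };
}

const result = fitToBudget(
  [
    { content: "x".repeat(400), importance: 5 }, // ~100 tokens
    { content: "y".repeat(400), importance: 3 }, // ~100 tokens
  ],
  150,
);
console.log(result.selected.length, result.truncated); // 1 true
```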


memory_stats

Return aggregate memory and relationship stats. Takes no input.

Returns:

```json
{
  "memories": {
    "total": 0,
    "oldest": null,
    "newest": null,
    "avg_importance": null
  },
  "relationships": { "total": 0 },
  "by_type": {}
}
```

Resources

| URI | MIME | Description |
| --- | --- | --- |
| `internal://instructions` | text/markdown | Markdown usage guide for all tools and workflows |
| `memory://memories/{hash}` | application/json | Returns one memory as JSON; hash completion supported |

Prompts

| Name | Arguments | Purpose |
| --- | --- | --- |
| `get-help` | none | Returns full usage instructions for all tools |

Configuration

Environment Variables

| Variable | Description | Default | Required |
| --- | --- | --- | --- |
| `MEMORY_DB_PATH` | SQLite database file path | `memory_db/memory.db` | No |
| `RECALL_MAX_FRONTIER_SIZE` | Max BFS frontier nodes per hop (100–50000) | `1000` | No |
| `RECALL_MAX_EDGE_ROWS` | Max relationship rows fetched per traversal (100–50000) | `5000` | No |
| `RECALL_MAX_VISITED_NODES` | Max visited nodes across entire traversal (100–50000) | `5000` | No |

[!IMPORTANT] If MEMORY_DB_PATH is relative (including the default memory_db/memory.db), it resolves from the process working directory.
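
The resolution rule above is the standard Node behavior and can be checked directly; pass an absolute path to pin the database regardless of where the client launches the server.

```typescript
import path from "node:path";

// A relative MEMORY_DB_PATH resolves against the process working
// directory, not the package install location.
const configured = process.env.MEMORY_DB_PATH ?? "memory_db/memory.db";
const dbPath = path.resolve(process.cwd(), configured);
console.log(path.isAbsolute(dbPath)); // true
```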

[!TIP] Add memory_db/ to your .gitignore to keep the database out of version control — it contains local session data and should not be shared or committed.

Limits and Constraints

| Item | Value |
| --- | --- |
| Content length | 1–100000 chars |
| Tag count | 1–100 per memory |
| Tag length | 1–50 chars, no whitespace |
| Hash format | 64-char lowercase hex SHA-256 |
| Search query length | 1–1000 chars |
| `search_memories.limit` | 1–100 (default 20) |
| `recall.depth` | 0–3 (default 1) |
| `recall.limit` | 1–50 (default 10) |
| `retrieve_context.token_budget` | 100–200000 (default 4000) |
| Batch size | 1–50 items (`store_memories`, `delete_memories`) |
| Recall frontier guard | `RECALL_MAX_FRONTIER_SIZE` (default 1000 per hop) |
| SQLite busy timeout | 5000 ms |

[!NOTE] Cursor values are opaque base64url-encoded tokens. Treat them as opaque and do not parse them.

Security

  • Transport is stdio-only (StdioServerTransport) — no HTTP endpoints.
  • Fatal process errors are written to stderr; stdout must remain clean for the MCP protocol.
  • All inputs are validated with strict Zod schemas and bounded field constraints before any database access.
  • Hashes are validated against a lowercase 64-char SHA-256 hex regex.
  • Search input is tokenized to alphanumeric terms before FTS MATCH execution (non-alphanumeric characters act as delimiters, preventing FTS injection).
  • SQLite foreign keys are enabled; relationship rows cascade-delete when a memory is removed.
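
The tokenization step above can be sketched as a simple sanitizer. The server's exact tokenizer is internal; the point is that splitting on non-alphanumerics and quoting each term leaves no way to smuggle FTS5 operators or syntax into the MATCH expression.

```typescript
// Split the raw query on non-alphanumeric characters, then quote each
// term. Quoted terms are literals to FTS5, so even "OR" or punctuation
// in user input cannot act as query syntax.
function toFtsMatch(query: string): string {
  const terms = query.split(/[^a-zA-Z0-9]+/).filter((t) => t.length > 0);
  return terms.map((t) => `"${t}"`).join(" "); // implicit AND of terms
}

console.log(toFtsMatch('sqlite OR "1=1; DROP"')); // "sqlite" "OR" "1" "1" "DROP"
```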

Development

Install dependencies:

```shell
npm install
```

Core scripts:

| Script | Command | Purpose |
| --- | --- | --- |
| `build` | `npm run build` | Clean, compile, validate instructions, copy assets, chmod executable |
| `dev` | `npm run dev` | TypeScript watch mode |
| `dev:run` | `npm run dev:run` | Run built server with .env and file watch |
| `start` | `npm run start` | Start built server |
| `test` | `npm run test` | Full build + tests via task runner |
| `test:fast` | `npm run test:fast` | Run TS tests directly with Node test runner |
| `lint` | `npm run lint` | ESLint checks |
| `lint:fix` | `npm run lint:fix` | ESLint auto-fix |
| `type-check` | `npm run type-check` | Strict TypeScript checks |
| `format` | `npm run format` | Prettier format |
| `inspector` | `npm run inspector` | Build and open MCP Inspector against stdio server |

Inspect with MCP Inspector:

```shell
npx @modelcontextprotocol/inspector node dist/index.js
```

Build & Release

GitHub Actions release workflow (.github/workflows/release.yml) handles versioning, validation, and publishing via a single workflow_dispatch trigger:

```text
workflow_dispatch (patch / minor / major / custom)
    │
    ▼
  release — bump package.json + server.json → lint → type-check → test → build → tag → GitHub Release
    │
    ├──► publish-npm ──► publish-mcp   (npm Trusted Publishing OIDC → MCP Registry)
    │
    └──► publish-docker                (GHCR, linux/amd64 + linux/arm64)
```

Trigger a release:

```shell
gh workflow run release.yml -f bump=patch
```

Or use the GitHub UI: Actions → Release → Run workflow.

[!NOTE] npm publishing uses OIDC Trusted Publishing — no NPM_TOKEN secret required. MCP Registry uses GitHub OIDC. Docker uses the built-in GITHUB_TOKEN.

Troubleshooting

| Symptom | Cause | Fix |
| --- | --- | --- |
| Startup fails with FTS5 error | Node.js build without FTS5 | Use Node.js 24+ with SQLite FTS5 support |
| `E_NOT_FOUND` on `get_memory` | Hash doesn't exist | Verify via `search_memories` first |
| `E_INVALID_CURSOR` | Stale or malformed cursor | Retry the request without the `cursor` parameter |
| MCP client can't connect | Custom stdout logging added | Ensure nothing writes to stdout in the server process |
| `aborted: true` in `recall` | Traversal hit a safety limit | Reduce `depth`, or tune `RECALL_MAX_*` env vars |
| Database locked errors | High concurrent write load | SQLite busy timeout is 5000 ms; reduce concurrent writes |

License

MIT
