
Zikkaron

Biologically-inspired persistent memory engine for Claude Code MCP agents

Registry · Stars: 4 · Updated: Mar 3, 2026 · Validated: Mar 5, 2026

Quick Install

uvx zikkaron


PyPI · Python 3.11+ · Tests · License: MIT

Persistent memory for Claude Code. 26 cognitive subsystems, 18 MCP tools, runs locally on SQLite.

The Problem

Every time you start a new Claude Code session, it forgets everything. Architecture decisions, debugging history, project conventions, file patterns you explained three times already — all gone. You end up re-explaining your entire codebase from scratch.

Zikkaron fixes this. It gives Claude Code a persistent memory that survives across sessions, consolidates over time, and surfaces the right context when you need it.

Quick Start

pip install zikkaron

Add to your Claude Code MCP config (~/.claude/settings.json):

{
  "mcpServers": {
    "zikkaron": {
      "command": "zikkaron"
    }
  }
}

Done. Claude Code now has persistent memory.

Make Claude use it automatically

Add this to your project's CLAUDE.md:

## Memory — Zikkaron
- On every new session, call `recall` with the current project name to load prior context
- Before starting any task, call `get_project_context` for the current working directory
- After completing any significant task, call `remember` to store what was done, decisions made, and outcomes

What It Looks Like

Session 1 — You're debugging a tricky auth issue:

Tool: remember
  content: "Auth tokens expire silently when Redis cache is cold-started.
            Fix: added token refresh middleware in auth/middleware.py.
            Root cause was TTL mismatch between Redis and JWT expiry."
  context: "myapp backend debugging"
  tags: ["auth", "redis", "debugging"]

Session 2 — Days later, a related bug appears. Claude automatically recalls:

Tool: recall
  query: "authentication token issues"

> Memory #42 (heat: 0.87): Auth tokens expire silently when Redis cache
> is cold-started. Fix: added token refresh middleware in auth/middleware.py.
> Root cause was TTL mismatch between Redis and JWT expiry.

No re-explaining. No digging through old conversations. It just remembers.

How It Works

Zikkaron isn't a text file that gets loaded on startup. It's a memory engine built on computational neuroscience:

  • Predictive coding write gate — Only stores what's actually new. Redundant information is filtered at ingest.
  • Heat-based salience — Frequently accessed memories stay hot. Unused ones decay naturally, like biological memory.
  • Sleep consolidation — Background process replays memories, discovers cross-project connections, and compresses old knowledge.
  • Reconsolidation — Memories update when retrieved in new contexts, staying accurate as your codebase evolves.
  • Fractal hierarchy — Memories cluster into summaries at multiple scales. Drill down from high-level architecture to specific implementation details.
  • Knowledge graph — Entities and relationships are extracted and linked. Personalized PageRank surfaces contextually relevant memories.
  • Causal discovery — Learns cause-effect relationships from your coding sessions using the PC algorithm.
  • Successor representations — Memories that co-occur in similar contexts cluster together, even when their content differs.
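The heat-based salience mechanics above can be sketched in a few lines. This is an illustrative model, not Zikkaron's actual implementation; the constants mirror the documented ZIKKARON_DECAY_FACTOR and ZIKKARON_COLD_THRESHOLD defaults (0.95 and 0.05):

```python
# Illustrative sketch of heat-based salience, not Zikkaron's real code.
# Constants mirror the documented ZIKKARON_DECAY_FACTOR / ZIKKARON_COLD_THRESHOLD.
DECAY_FACTOR = 0.95
COLD_THRESHOLD = 0.05

def consolidation_cycle(memories):
    """Apply one decay cycle; return (hot, archival_candidates)."""
    hot, cold = [], []
    for mem in memories:
        mem["heat"] *= DECAY_FACTOR
        (cold if mem["heat"] < COLD_THRESHOLD else hot).append(mem)
    return hot, cold

def on_access(mem, boost=0.3):
    """Accessing a memory re-heats it, capped at 1.0."""
    mem["heat"] = min(1.0, mem["heat"] + boost)

# A frequently accessed memory stays hot; a barely-above-threshold one decays out.
mems = [{"id": 1, "heat": 0.87}, {"id": 2, "heat": 0.052}]
hot, cold = consolidation_cycle(mems)
```

Accessing a memory between cycles (`on_access`) counteracts the decay, which is what keeps frequently used knowledge retrievable.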

All data stays on your machine in a single SQLite database. No cloud, no API calls, no telemetry.

MCP Tools

Zikkaron exposes 18 tools over MCP:

| Tool | What it does |
| --- | --- |
| remember | Store a memory (passes through the predictive coding write gate) |
| recall | Semantic + keyword search with heat-weighted ranking |
| forget | Delete a memory |
| validate_memory | Check if a memory is still valid against current file state |
| get_project_context | Get all active memories for a directory |
| consolidate_now | Trigger a consolidation cycle |
| memory_stats | System statistics |
| rate_memory | Give usefulness feedback for metamemory tracking |
| recall_hierarchical | Query the fractal hierarchy at a specific level |
| drill_down | Navigate into a memory cluster |
| create_trigger | Set a prospective trigger that fires on matching context |
| get_project_story | Get the autobiographical narrative for a project |
| add_rule | Define neuro-symbolic rules for filtering/re-ranking |
| get_rules | List active rules |
| navigate_memory | Traverse concept space using successor representations |
| get_causal_chain | Get causal ancestors/descendants for an entity |
| assess_coverage | Evaluate knowledge coverage with gap identification |
| detect_gaps | Find knowledge gaps: isolated entities, stale regions, missing connections |

Architecture

Zikkaron runs as a local MCP server. All data stays on your machine in a single SQLite database with WAL mode, FTS5 full-text search, and sqlite-vec for vector similarity.
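Two of those storage building blocks, WAL mode and FTS5, are available directly in Python's standard sqlite3 module (sqlite-vec is a separate loadable extension). A minimal sketch with a hypothetical table name:

```python
import sqlite3

# Minimal sketch of the WAL + FTS5 building blocks. The table name is
# hypothetical; sqlite-vec vector search would be loaded as a separate extension.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA journal_mode=WAL")  # concurrent readers alongside one writer
conn.execute("CREATE VIRTUAL TABLE mem_fts USING fts5(content)")
conn.execute(
    "INSERT INTO mem_fts VALUES "
    "('Auth tokens expire when Redis cache is cold-started')"
)
conn.commit()

# FTS5 MATCH does keyword search; bm25() exposes the relevance score.
rows = conn.execute(
    "SELECT content, bm25(mem_fts) FROM mem_fts WHERE mem_fts MATCH ?",
    ("redis tokens",),
).fetchall()
```

Zikkaron's retriever fuses this BM25 signal with vector similarity and graph-based scores rather than using it alone.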

26 subsystems organized into five tiers:

Core Storage and Retrieval

| Module | Role |
| --- | --- |
| storage.py | SQLite WAL engine with 15 tables, FTS5 indexing, sqlite-vec ANN search |
| embeddings.py | Sentence-transformer encoding (all-MiniLM-L6-v2) with batched operations |
| retrieval.py | Multi-signal fusion retriever combining vector similarity, FTS5 BM25, knowledge graph PPR, spreading activation, and fractal hierarchy traversal |
| models.py | Pydantic data models for memories, entities, relationships, clusters, rules, and causal edges |
| config.py | Environment-based configuration with ZIKKARON_ prefix |
Memory Dynamics

| Module | Role |
| --- | --- |
| thermodynamics.py | Heat-based memory salience. Surprise scoring, importance heuristics, emotional valence, and temporal decay govern which memories stay accessible |
| reconsolidation.py | Memories become labile on retrieval and are rewritten based on context mismatch magnitude. Implements the Nader et al. (2000) reconsolidation model with three outcomes: reinforcement, modification, or archival |
| predictive_coding.py | Write gate that only stores prediction errors. Maintains a generative model per directory context and computes surprisal against existing knowledge; redundant information is filtered at ingest |
| engram.py | Competitive memory slot allocation based on CREB-like excitability (Josselyn & Frankland, 2018). High-excitability slots win allocation; temporally proximate memories share engram slots |
| compression.py | Rate-distortion optimal forgetting (Toth et al., 2020). Memories degrade progressively: full fidelity at 0-7 days, gist compression at 7-30 days, semantic tag extraction beyond 30 days |
| staleness.py | File-change watchdog using SHA-256 hashing to detect when source code has diverged from stored memories |
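The compression schedule documented for compression.py maps directly to an age-based tier function. A hypothetical sketch (the real module does the actual gist extraction and tagging, not just the tier decision):

```python
from datetime import datetime, timedelta

# Sketch of the documented compression schedule (illustrative, not the real
# module): full fidelity at 0-7 days, gist at 7-30 days, tags beyond 30 days.
def compression_tier(created_at, now=None):
    age_days = ((now or datetime.now()) - created_at).days
    if age_days < 7:
        return "full"
    if age_days < 30:
        return "gist"
    return "semantic_tags"

now = datetime(2026, 3, 1)
tier = compression_tier(now - timedelta(days=10), now)  # a 10-day-old memory
```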
Consolidation and Organization

| Module | Role |
| --- | --- |
| consolidation.py | Background astrocyte daemon running periodic consolidation cycles: decay application, staleness checks, prospective trigger evaluation |
| astrocyte_pool.py | Domain-specialized consolidation processes (code structure, architectural decisions, error patterns, dependency tracking) running as a worker pool |
| sleep_compute.py | Offline "dream replay" that replays memory pairs to discover cross-project connections, runs Louvain community detection for clustering, and performs temporal compression |
| fractal.py | Hierarchical multi-scale memory tree. Memories cluster at leaf level; clusters merge into intermediate summaries; summaries merge into root abstractions. Supports drill-down navigation |
| cls_store.py | Complementary Learning Systems (McClelland et al., 1995). Dual-store architecture: fast episodic capture in a hippocampal buffer, slow semantic abstraction in a neocortical store with periodic interleaved replay |
Knowledge Structure

| Module | Role |
| --- | --- |
| knowledge_graph.py | Typed entity-relationship graph with co-occurrence, causal, and temporal edges. Supports Personalized PageRank for contextual retrieval |
| causal_discovery.py | PC algorithm (Spirtes, Glymour, Scheines, 2000) for discovering causal DAGs from coding session event logs. Conditional independence testing via partial correlation |
| cognitive_map.py | Successor Representation (Stachenfeld et al., 2017) for navigation-based retrieval. Memories that co-occur in similar contexts cluster in SR space, enabling associative traversal even when content differs |
| narrative.py | Autobiographical project stories synthesized from memory timelines, key decisions, and significant events |
| curation.py | Automated memory maintenance: duplicate merging, contradiction detection, cross-reference linking |
Frontier Capabilities

| Module | Role |
| --- | --- |
| hopfield.py | Modern continuous Hopfield networks (Ramsauer et al., 2021). Energy-based associative retrieval equivalent to transformer attention: softmax(beta * X^T * query) |
| hdc_encoder.py | Hyperdimensional Computing / Vector Symbolic Architecture (Kanerva, 1988). Encodes memories as role-filler bindings in 10,000-dimensional bipolar space for structured queries |
| metacognition.py | Self-assessment of knowledge coverage. Gap detection across five dimensions: isolated entities, stale regions, low-confidence zones, missing connections, one-sided knowledge |
| rules_engine.py | Neuro-symbolic constraints. Hard rules (must satisfy) and soft rules (preference boosts/penalties) scoped to global, directory, or file level |
| crdt_sync.py | Multi-agent memory sharing via CRDTs (OR-Set for collections, LWW-Register for content, G-Counter for access counts). Automatic conflict resolution across agent instances |
| prospective.py | Future-oriented triggers that fire when matching context is detected: directory, keyword, entity, or time-based conditions |
| sensory_buffer.py | Episodic capture buffer for raw session content with configurable token windows and overlap |
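The hopfield.py attention formula above, softmax(beta * X^T * query), can be shown in a few lines of plain Python. This is an illustrative sketch, not the module's actual code; beta=8.0 mirrors the documented ZIKKARON_HOPFIELD_BETA default:

```python
import math

# Sketch of modern Hopfield retrieval (illustrative, not Zikkaron's hopfield.py):
# softmax(beta * X^T * q) gives attention weights over stored patterns; the
# retrieved memory is the weight-averaged pattern. Higher beta = sharper recall.
def hopfield_retrieve(patterns, query, beta=8.0):
    scores = [beta * sum(x * q for x, q in zip(p, query)) for p in patterns]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(query)
    return [sum(w * p[i] for w, p in zip(weights, patterns)) for i in range(dim)]

# A noisy query snaps almost entirely onto the nearest stored pattern.
stored = [[1.0, 0.0], [0.0, 1.0]]
out = hopfield_retrieve(stored, [0.9, 0.1])
```

With beta = 8.0 the noisy query [0.9, 0.1] retrieves a vector very close to [1, 0]; lowering beta blends the stored patterns instead.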

Advanced Setup

From source

git clone https://github.com/amanhij/Zikkaron.git
cd Zikkaron
pip install -e .

SSE transport

For running as a persistent background server instead of stdio:

zikkaron --transport sse

Then configure Claude Code to connect via URL:

{
  "mcpServers": {
    "zikkaron": {
      "type": "sse",
      "url": "http://127.0.0.1:8742/sse"
    }
  }
}

The default port is 8742 (override with --port), and the database defaults to ~/.zikkaron/memory.db (override with --db-path).

Configuration

All settings are configurable via environment variables with the ZIKKARON_ prefix:

| Variable | Default | Description |
| --- | --- | --- |
| ZIKKARON_PORT | 8742 | Server port |
| ZIKKARON_DB_PATH | ~/.zikkaron/memory.db | Database location |
| ZIKKARON_EMBEDDING_MODEL | all-MiniLM-L6-v2 | Sentence-transformer model |
| ZIKKARON_DECAY_FACTOR | 0.95 | Base heat decay per consolidation cycle |
| ZIKKARON_COLD_THRESHOLD | 0.05 | Heat below which memories are candidates for archival |
| ZIKKARON_WRITE_GATE_THRESHOLD | 0.4 | Minimum surprisal to pass the predictive coding write gate |
| ZIKKARON_HOPFIELD_BETA | 8.0 | Hopfield network sharpness parameter |
| ZIKKARON_SR_DISCOUNT | 0.9 | Successor representation discount factor |
| ZIKKARON_COGNITIVE_LOAD_LIMIT | 4 | Maximum chunks in active context (Cowan's 4 ± 1) |

See zikkaron/config.py for the full list.
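A hypothetical sketch of how prefixed settings like these are typically loaded, with environment values coerced to the type of each default (the real field names and parsing live in zikkaron/config.py):

```python
import os

# Hypothetical sketch of ZIKKARON_-prefixed configuration loading; the actual
# fields and parsing live in zikkaron/config.py. Env values override defaults
# and are coerced to the default's type (int, float, str, ...).
DEFAULTS = {"PORT": 8742, "DECAY_FACTOR": 0.95, "COLD_THRESHOLD": 0.05}

def load_config(environ=None):
    env = os.environ if environ is None else environ
    config = {}
    for key, default in DEFAULTS.items():
        raw = env.get(f"ZIKKARON_{key}")
        config[key] = type(default)(raw) if raw is not None else default
    return config

# Only the overridden variable changes; everything else keeps its default.
cfg = load_config({"ZIKKARON_DECAY_FACTOR": "0.9"})
```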

Testing

python -m pytest zikkaron/tests/ -x -q

891 tests across 33 test files covering all subsystems.

References

Academic papers and books that informed the implementation:
  • Ramsauer et al. "Hopfield Networks is All You Need" (ICLR 2021, arXiv:2008.02217)
  • Nader, Schafe, LeDoux. "Fear memories require protein synthesis in the amygdala for reconsolidation after retrieval" (Nature 406, 2000)
  • Osan, Tort, Amaral. "Three outcomes of reconsolidation" (PLoS ONE, 2011)
  • McClelland, McNaughton, O'Reilly. "Why there are complementary learning systems in the hippocampus and neocortex" (Psych. Review 102, 1995)
  • Sun et al. "Organizing memories for generalization in complementary learning systems" (Nature Neuroscience 26, 2023)
  • Stachenfeld, Botvinick, Gershman. "The hippocampus as a predictive map" (Nature Neuroscience 20, 2017)
  • Whittington et al. "The Tolman-Eichenbaum Machine" (Cell 183, 2020)
  • Spirtes, Glymour, Scheines. Causation, Prediction, and Search (MIT Press, 2000)
  • Kanerva. Sparse Distributed Memory (MIT Press, 1988)
  • Frady, Kleyko, Sommer. "Variable Binding for Sparse Distributed Representations" (IEEE TNNLS, 2022)
  • Toth et al. "Optimal forgetting via rate-distortion theory" (PLoS Computational Biology, 2020)
  • Josselyn, Frankland. "Memory allocation: mechanisms and function" (Annual Review Neuroscience 41, 2018)
  • Rashid et al. "Competition between engrams influences fear memory formation and recall" (Science 353, 2016)
  • Zhou et al. "MetaRAG: Metacognitive Retrieval-Augmented Generation" (ACM Web, 2024)

License

MIT
