The Problem with the Status Quo
Most AI systems don't have memory. They have a vector store.
The difference? A vector store is a very good index. You store text, convert it to numbers, and find similar text later. That works — for simple use cases.
But imagine your brain were a vector store. You could recall facts that sound similar to your question. But you wouldn't know when you learned them. You couldn't forget. You couldn't form connections between memories that seem unrelated on the surface. You couldn't sleep and consolidate what you've learned.
In short: you'd be a very good search engine. But not a thinking being.
What Neuroscience Teaches Us
Human memory isn't a single system. It's an ensemble of specialized subsystems working together. The key findings that shaped our architecture:
Endel Tulving (1972): The distinction between episodic memory (concrete experiences with time and place) and semantic memory (abstract knowledge). Your brain stores "I was in a meeting with Maria yesterday" differently from "Meetings should have an agenda."
Hermann Ebbinghaus (1885): The forgetting curve. Memories fade exponentially — but each review makes them more stable. This principle is the foundation of spaced repetition.
Stickgold & Walker (2013): Sleep consolidation. While you sleep, your brain replays the day's experiences and decides what gets transferred to long-term memory. Important connections are strengthened, irrelevant ones weakened.
Donald Hebb (1949): "Neurons that fire together, wire together." When two concepts are regularly activated together, their connection strengthens. The foundation of associative learning.
From Papers to Code
We translated these findings into twelve algorithms, available as @zensation/algorithms on npm:
Ebbinghaus Decay
Every memory has a strength that decays exponentially over time. But: each access strengthens it. The formula accounts for age, access frequency, and emotional significance.
Retention = e^(-t / (S * (1 + accessCount * 0.3)))

where t is the elapsed time since the memory was last accessed and S is the memory's base stability.
This means: important, frequently used memories persist. Irrelevant ones fade naturally. No manual cleanup needed.
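The formula above can be sketched directly in TypeScript. This is an illustrative function, not the actual @zensation/algorithms API; the function and parameter names here are our own:

```typescript
// Retention = e^(-t / (S * (1 + accessCount * 0.3)))
// t: elapsed time since last access (e.g. in days)
// stability: base stability S of the memory
// accessCount: how often the memory has been recalled
function retention(t: number, stability: number, accessCount: number): number {
  // Each access effectively stretches the decay timescale by 30%.
  const effectiveStability = stability * (1 + accessCount * 0.3);
  return Math.exp(-t / effectiveStability);
}

// A memory with stability 7 (days) and no reviews retains ~24% after 10 days;
// the same memory recalled 5 times retains ~56%.
```

The shape matters more than the constants: frequent access flattens the curve, so well-used memories survive while untouched ones fade on their own.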
Hebbian Dynamics
When you mention "React" and "Performance" in a conversation, the connection between these concepts strengthens in the knowledge graph. The more often they appear together, the stronger the association.
At the same time, we normalize connection strengths so no single concept dominates all others. And we apply decay — connections that haven't been activated in a long time gradually weaken.
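A minimal sketch of these three mechanics (co-activation, decay, per-node normalization), with hypothetical constants and data structures that are not the library's actual internals:

```typescript
// Adjacency map: concept -> (neighbor -> connection strength)
type Graph = Map<string, Map<string, number>>;

const LEARNING_RATE = 0.1; // strength gained per co-activation (illustrative)
const DECAY = 0.99;        // per-tick fade for unused connections (illustrative)

// Hebbian step: two concepts mentioned together strengthen both directions.
function coActivate(graph: Graph, a: string, b: string): void {
  for (const [from, to] of [[a, b], [b, a]]) {
    const edges = graph.get(from) ?? new Map<string, number>();
    edges.set(to, (edges.get(to) ?? 0) + LEARNING_RATE);
    graph.set(from, edges);
  }
}

// Maintenance step: decay every edge, then cap each node's total
// outgoing strength so no single concept dominates the graph.
function decayAndNormalize(graph: Graph): void {
  for (const edges of graph.values()) {
    let total = 0;
    for (const [to, w] of edges) {
      const decayed = w * DECAY;
      edges.set(to, decayed);
      total += decayed;
    }
    if (total > 1) {
      for (const [to, w] of edges) edges.set(to, w / total);
    }
  }
}
```

Mentioning "React" and "Performance" together repeatedly calls `coActivate` each time; the maintenance step runs periodically so associations that are never reinforced drift back toward zero.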
FSRS (Free Spaced Repetition Scheduler)
Based on the SuperMemo algorithm, but modernized. FSRS plans when you should review knowledge to optimally anchor it in long-term memory. Each rating (1-5) adjusts the intervals.
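To make the idea concrete, here is a deliberately simplified scheduler in the spirit of spaced repetition. Real FSRS models memory stability and difficulty with fitted parameters; this sketch (our own names and constants) only shows how ratings move the review interval:

```typescript
interface Card {
  intervalDays: number; // days until the next scheduled review
  ease: number;         // growth factor, adjusted by each rating
}

function review(card: Card, rating: 1 | 2 | 3 | 4 | 5): Card {
  if (rating <= 2) {
    // Failed recall: review again soon, and mark the card as harder.
    return { intervalDays: 1, ease: Math.max(1.3, card.ease - 0.2) };
  }
  // Successful recall: good ratings raise the ease, which widens
  // the gap to the next review.
  const ease = card.ease + (rating - 3) * 0.1;
  return { intervalDays: Math.round(card.intervalDays * ease), ease };
}
```

The effect is the one Ebbinghaus predicts: each successful review pushes the next one further out, so stable knowledge costs almost nothing to maintain.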
Bayesian Confidence
Not all information is equally certain. When ZenBrain has a statement at 60% confidence and new evidence emerges, it updates confidence using Bayes' theorem. Including 95% confidence intervals — so you know how sure the system is.
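The core update is plain Bayes' rule. A sketch for a single proposition (the likelihood values in the example are illustrative, and the real library additionally tracks the confidence intervals mentioned above):

```typescript
// prior: current confidence P(H)
// pEvidenceIfTrue: P(E | H) — how likely the evidence is if the statement holds
// pEvidenceIfFalse: P(E | not H)
function bayesUpdate(
  prior: number,
  pEvidenceIfTrue: number,
  pEvidenceIfFalse: number
): number {
  const pEvidence = pEvidenceIfTrue * prior + pEvidenceIfFalse * (1 - prior);
  return (pEvidenceIfTrue * prior) / pEvidence;
}

// Starting from 60% confidence, evidence that is 3x more likely if the
// statement is true (0.9 vs 0.3) lifts confidence to ~82%.
```

Weak evidence nudges confidence; strong evidence moves it sharply; evidence that is equally likely either way leaves it untouched.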
The Seven Layers
These algorithms work together across seven layers:
| Layer | Function | Inspiration |
|-------|----------|-------------|
| Working Memory | Active focus (current task) | Baddeley & Hitch (1974) |
| Short-Term Memory | Session context (last minutes) | George Miller (1956) |
| Episodic Memory | Concrete experiences with context | Tulving (1972) |
| Semantic Memory | Facts and concepts | Tulving (1972) |
| Procedural Memory | Processes and workflows | Squire (1992) |
| Prospective Memory | Future intentions | Einstein & McDaniel (1990) |
| Long-Term Memory | Persistent, consolidated knowledge | Atkinson & Shiffrin (1968) |
The MemoryCoordinator orchestrates all layers: it automatically routes new information to the right layer, performs cross-layer recall, and triggers consolidation.
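To give a feel for the routing step, here is a hypothetical sketch. The real MemoryCoordinator's heuristics are internal to ZenBrain; the type names and rules below are ours:

```typescript
type Layer = "episodic" | "semantic" | "procedural" | "prospective";

interface IncomingItem {
  hasTimeAndPlace: boolean; // tied to a concrete experience?
  isHowTo: boolean;         // describes a process or workflow?
  isFutureIntent: boolean;  // something to act on later?
}

function routeToLayer(item: IncomingItem): Layer {
  if (item.isFutureIntent) return "prospective"; // "remind me on Friday"
  if (item.isHowTo) return "procedural";         // "this is how we deploy"
  if (item.hasTimeAndPlace) return "episodic";   // "yesterday's meeting with Maria"
  return "semantic";                             // timeless facts and concepts
}
```

Recall then runs in the opposite direction: a query fans out across layers and the coordinator merges the results, which is why episodic details and semantic facts can surface in the same answer.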
Sleep Consolidation: Our Unique Differentiator
To our knowledge, no other AI memory system on the market has sleep consolidation: not Mem0, not Zep, not Letta.
Why? Because it's hard to build and doesn't fit on a feature checklist. But it's crucial for memory quality.
Our Sleep Compute Engine:
- Selects memories for replay — based on emotionality, novelty, and relevance
- Simulates replay — strengthens connections between related memories
- Prunes weak connections — active forgetting as quality control
The result: a memory that gets better over time. Not because it stores more, but because it weighs more intelligently.
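The selection and pruning steps can be sketched as follows. The scoring weights and threshold here are illustrative assumptions, not the Sleep Compute Engine's actual parameters:

```typescript
interface Memory {
  id: string;
  emotionality: number; // 0..1
  novelty: number;      // 0..1
  relevance: number;    // 0..1
}

// Step 1: pick the highest-scoring memories for replay, within a budget.
function selectForReplay(memories: Memory[], budget: number): Memory[] {
  const score = (m: Memory) =>
    0.4 * m.emotionality + 0.3 * m.novelty + 0.3 * m.relevance;
  return [...memories].sort((a, b) => score(b) - score(a)).slice(0, budget);
}

// Step 3: active forgetting — drop connections whose strength has
// decayed below a threshold.
function prune(weights: Map<string, number>, threshold = 0.05): void {
  for (const [edge, w] of weights) {
    if (w < threshold) weights.delete(edge);
  }
}
```

Step 2, replay itself, would reuse the Hebbian co-activation described earlier: replaying two related memories together strengthens their connection without any new user input.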
Open Source, Zero Dependencies
The entire algorithms library is open source as @zensation/algorithms (Apache 2.0). 12 algorithms, 179 tests, zero external dependencies. TypeScript-native.
Why zero dependencies? Because a memory system is the foundation of your application. It should be stable, auditable, and free from supply chain risks.
```
npm install @zensation/algorithms @zensation/core
```
What's Next
We're working on retention curve visualization — so you can see the Ebbinghaus curve of your memories as a chart. And on cross-context entity merging, which detects when the same person or concept appears across different contexts.
Neuroscience has much more to offer. We're translating it — one algorithm at a time.
For a deep dive into the technical architecture, read our post on the 7-Layer Memory Architecture. In the next post, we discuss a values decision: why we choose self-hosting over the cloud.
