
The memory engine for AI that actually knows you.

Not another vector database. YantrikDB models how memory actually works — temporal decay, semantic consolidation, contradiction detection, and proactive triggers. One embedded engine. Five unified indexes. Zero servers.
pip install yantrikdb
cargo add yantrikdb

Every AI memory solution does the same thing:

Store everything. Embed. Retrieve top-k. Inject into context. Hope it helps.

That doesn’t model how memory works. It treats all memories as equal. Old memories never fade. Contradictions are never detected. Nothing is ever consolidated. The AI never proactively remembers anything.

YantrikDB fixes all of this.

Relevance-Conditioned Scoring

Relevance gates every other signal multiplicatively. A perfectly relevant old memory still surfaces; an irrelevant high-importance memory doesn't. This is the key insight, and it is patent-pending.
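The gating idea is easy to sketch. The snippet below is a toy illustration, not YantrikDB's actual formula: the weights, half-life, and recency curve are all assumptions, chosen only to show why multiplying by relevance behaves differently from adding it.

```python
import math

def score(relevance: float, importance: float, age_days: float,
          half_life_days: float = 30.0) -> float:
    """Hypothetical relevance-gated score: relevance multiplies, rather than
    adds to, the other signals. Weights and half-life are illustrative."""
    # Recency halves every `half_life_days`
    recency = math.exp(-math.log(2) * age_days / half_life_days)
    return relevance * (0.6 * importance + 0.4 * recency)

# A perfectly relevant year-old memory still surfaces...
old_but_relevant = score(relevance=0.95, importance=0.5, age_days=365)

# ...while an irrelevant high-importance memory from yesterday scores near zero.
irrelevant = score(relevance=0.05, importance=0.9, age_days=1)
```

With additive scoring, the second memory's high importance would keep it competitive; with multiplicative gating, low relevance suppresses it no matter how important or fresh it is.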

Cognitive State Graph

Typed nodes (beliefs, goals, intents, preferences) with typed edges (supports, contradicts, causes, predicts). Your AI doesn’t just remember — it reasons about what it knows.
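As a rough mental model (not YantrikDB's internal API), a typed graph like this fits in a few lines. The node and edge type sets come from the description above; every class and method name here is illustrative.

```python
from dataclasses import dataclass, field

NODE_TYPES = {"belief", "goal", "intent", "preference"}
EDGE_TYPES = {"supports", "contradicts", "causes", "predicts"}

@dataclass
class Edge:
    src: str
    dst: str
    kind: str
    weight: float = 1.0

@dataclass
class CognitiveGraph:
    nodes: dict = field(default_factory=dict)  # name -> node type
    edges: list = field(default_factory=list)

    def add_node(self, name: str, kind: str) -> None:
        if kind not in NODE_TYPES:
            raise ValueError(f"unknown node type: {kind}")
        self.nodes[name] = kind

    def relate(self, src: str, dst: str, kind: str, weight: float = 1.0) -> None:
        if kind not in EDGE_TYPES:
            raise ValueError(f"unknown edge type: {kind}")
        self.edges.append(Edge(src, dst, kind, weight))

    def contradictions(self) -> list:
        # Typed edges make conflicts queryable, not just stored.
        return [e for e in self.edges if e.kind == "contradicts"]

g = CognitiveGraph()
g.add_node("prefers_python", "preference")
g.add_node("ship_in_js", "goal")
g.relate("ship_in_js", "prefers_python", "contradicts", weight=0.8)
print(len(g.contradictions()))  # → 1
```

The point of the types is the last method: because edges carry semantics, "what does the AI believe that conflicts?" is a direct query rather than an LLM guess.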

Autonomous Cognition

Consolidation merges related memories. Conflict detection flags contradictions. Pattern mining discovers recurring themes. All automatic via db.think().

Proactive Triggers

Decaying memories, unresolved conflicts, emerging patterns — YantrikDB tells your AI when to act, grounded in real data. Not engagement farming.

Five Unified Indexes

Vector (HNSW), graph, temporal, decay heap, and key-value — all in one embedded SQLite database. No server. No infrastructure. Just a file.
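The single-file claim is easy to picture. Here is a hypothetical sketch, not YantrikDB's real schema, of how several logical indexes can share one SQLite database:

```python
import sqlite3

# Hypothetical layout (not YantrikDB's actual schema): one SQLite file,
# one logical index per table, all referencing the same memories.
conn = sqlite3.connect(":memory:")  # a real deployment would open "memory.db"
conn.executescript("""
CREATE TABLE memories (id INTEGER PRIMARY KEY, text TEXT,
                       importance REAL, created_at REAL);
CREATE TABLE vectors  (memory_id INTEGER, embedding BLOB);          -- vector (HNSW)
CREATE TABLE edges    (src TEXT, dst TEXT, kind TEXT, weight REAL); -- graph
CREATE TABLE kv       (key TEXT PRIMARY KEY, value TEXT);           -- key-value
CREATE INDEX idx_created ON memories(created_at);                   -- temporal
-- a decay heap can be derived on read, e.g. importance * exp(-age)
""")
conn.execute(
    "INSERT INTO memories (text, importance, created_at) VALUES (?, ?, ?)",
    ("User prefers Python", 0.7, 0.0),
)
row = conn.execute("SELECT text FROM memories").fetchone()
print(row[0])  # → User prefers Python
```

Because everything lives in one transactional file, a single query can join semantic, graph, and temporal signals without crossing a network boundary.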

MCP Server

pip install yantrikdb-mcp — instant persistent memory for Claude Code, Cursor, Windsurf, and any MCP-compatible AI agent.


from yantrikdb import YantrikDB

db = YantrikDB("memory.db", embedding_dim=384)

# Remember — with importance and emotional valence
db.record("User prefers Python over JavaScript", importance=0.7)
db.record("User is stressed about Friday's deadline", importance=0.9, valence=-0.6)

# Recall — relevance-conditioned scoring, not just cosine similarity
results = db.recall("What's bothering the user?", top_k=5)
# → Returns the deadline stress memory (high relevance × high importance × recent)

# Relate — build the cognitive graph
db.relate("user", "deadline", "stressed_about", weight=0.9)

# Think — autonomous cognition loop
result = db.think()
# → Consolidates similar memories, detects contradictions, mines patterns
# → Returns proactive triggers: "User's deadline is approaching, stress is high"

| Index | What It Does | Example Query |
|---|---|---|
| Vector (HNSW) | Semantic similarity search | "What did the user say about work?" |
| Graph | Entity relationships & reasoning | "Who works at what company?" |
| Temporal | Time-aware retrieval | "What happened last Tuesday?" |
| Decay Heap | Importance with biological time decay | Memories fade like human memory |
| Key-Value | Instant fact lookup | "User's timezone is CST" |

All five indexes query the same data. A single recall() call blends signals from all of them into one relevance-conditioned score.


| | Vector DB | RAG Pipeline | YantrikDB |
|---|---|---|---|
| Storage | Flat embeddings | Chunked documents | Typed memories with metadata |
| Retrieval | Cosine top-k | Hybrid search | Relevance-conditioned scoring |
| Time | Ignored | Ignored | Temporal decay + recency |
| Contradictions | Undetected | Undetected | Automatic conflict detection |
| Consolidation | None | None | Autonomous merging |
| Proactive | Never | Never | Trigger-based notifications |
| Graph | Separate system | None | Built-in cognitive state graph |

U.S. Patent Application No. 19/573,392 (filed March 2026) — covers relevance-conditioned scoring, the cognitive state graph, and the unified system architecture.

Open source under AGPL-3.0. The patent protects the methods, not the code. Use it freely. Read more →


| Component | Description | License |
|---|---|---|
| YantrikDB | Cognitive memory engine | AGPL-3.0 |
| YantrikDB MCP | MCP server for AI agents | MIT |
| Yantrik ML | Pluggable AI inference — LLM, embeddings, vision, TTS | AGPL-3.0 |
| Yantrik Companion | AI agent with instincts, tools, and personality | AGPL-3.0 |
| Yantrik OS | AI-native desktop operating system (Rust + Slint) | AGPL-3.0 |

Built by one person. All open source. Get started →