All posts

Why we built Kronroe

AI agents need memory that understands time. Not a key-value store bolted to a vector database — a purpose-built engine where every fact carries four timestamps.

↳ the memory problem nobody talks about

Your agent says “Alice works at Stripe.” But Alice left six months ago. The old value was overwritten. There’s no history, no correction trail, no way to ask “what did we believe last Tuesday?”

This is the memory problem nobody talks about. Not retrieval quality. Not embedding models. The problem is that most agent memory systems treat facts like cache entries — mutable, ephemeral, and fundamentally timeless.

The overwrite trap

Most memory solutions for AI agents follow the same pattern: store a fact, and when it changes, overwrite it. A key-value store. A vector database with upsert. A graph database where you mutate the edge.

This works fine until it doesn't. Consider an agent managing a sales pipeline: on Monday it records "Acme is in negotiation," on Thursday the deal closes and the fact is overwritten with "Acme is closed-won," and on Friday someone asks how long Acme spent in negotiation.

If you overwrote Monday's fact on Thursday, you can't answer Friday's question. The information is gone. Not archived, not versioned — just gone.

↳ this happens in production. constantly.

And it gets worse. What if Thursday’s update was wrong? What if someone typed “Figma” when they meant “Firma”? Now you’ve lost the truth and introduced an error you can’t trace.
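The trap is easy to reproduce. Here is a sketch in plain Python (not Kronroe's API; the keys and values are made up) contrasting an upsert store with an append-only log:

```python
# Overwrite-style memory: a plain upsert store loses history.
memory = {}
memory["alice.employer"] = "Stripe"   # stored months ago
memory["alice.employer"] = "Figma"    # Thursday's update overwrites it

# Friday's question: "what did we believe last Tuesday?"
assert memory["alice.employer"] == "Figma"
# The old value is gone -- not archived, not versioned. There is no way
# to recover "Stripe" or to see when (or why) it changed.

# Append-only memory: every write keeps a transaction timestamp instead.
from datetime import datetime, timezone

log = []

def record(key, value):
    # Append a (when-we-learned-it, key, value) entry; never delete.
    log.append((datetime.now(timezone.utc), key, value))

record("alice.employer", "Stripe")
record("alice.employer", "Figma")   # supersedes, but does not destroy

history = [v for _, k, v in log if k == "alice.employer"]
assert history == ["Stripe", "Figma"]  # the correction trail survives
```

With the log, a wrong update is just another entry: you can see what was believed, when, and trace the error instead of silently inheriting it.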

What bi-temporal means

Kronroe uses the bi-temporal model — a concept from database research (TSQL2) that tracks two independent time dimensions on every fact:

Field        Dimension         What it captures
valid_from   Valid time        When the fact became true in the real world
valid_to     Valid time        When it stopped being true (None = still current)
recorded_at  Transaction time  When we first stored this fact
expired_at   Transaction time  When we superseded it (None = still the active record)

Valid time is about the world: when was Alice actually at Stripe? Transaction time is about the database: when did we learn this, and when did we correct it?

With both dimensions, you can answer questions that are impossible in a single-timeline system: What is true now? What was true last March? What did we believe last Tuesday? And when a belief changed, did the world change, or did we fix a mistake?
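The two dimensions can be exercised in a few lines of plain Python. This is an illustrative sketch, not Kronroe's API: `BitemporalStore`, `assert_fact`, and `as_of` are hypothetical names; only the four field names follow the table above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Fact:
    subject: str
    value: str
    valid_from: datetime            # valid time: true in the world from...
    valid_to: Optional[datetime]    # ...until (None = still true)
    recorded_at: datetime           # transaction time: when we stored it
    expired_at: Optional[datetime]  # when superseded (None = active record)

class BitemporalStore:
    def __init__(self):
        self.facts: list[Fact] = []

    def assert_fact(self, subject, value, valid_from, now):
        """Record a new fact, superseding (never deleting) the active one."""
        for f in list(self.facts):
            if f.subject == subject and f.expired_at is None:
                f.expired_at = now  # the old record is retired, not erased
                # re-record the old value with its real-world validity closed
                self.facts.append(
                    Fact(f.subject, f.value, f.valid_from, valid_from, now, None))
        self.facts.append(Fact(subject, value, valid_from, None, now, None))

    def as_of(self, subject, valid_time, tx_time):
        """What did we believe, at tx_time, was true at valid_time?"""
        for f in self.facts:
            if (f.subject == subject
                    and f.recorded_at <= tx_time
                    and (f.expired_at is None or f.expired_at > tx_time)
                    and f.valid_from <= valid_time
                    and (f.valid_to is None or f.valid_to > valid_time)):
                return f.value
        return None

store = BitemporalStore()
# 2023-01-15: we learn Alice joined Stripe on 2023-01-01.
store.assert_fact("alice.employer", "Stripe",
                  valid_from=datetime(2023, 1, 1), now=datetime(2023, 1, 15))
# 2025-01-10: we learn she moved to Figma back on 2024-07-01.
store.assert_fact("alice.employer", "Figma",
                  valid_from=datetime(2024, 7, 1), now=datetime(2025, 1, 10))

today = datetime(2025, 2, 1)
last_dec = datetime(2024, 12, 1)
# The world: where was Alice actually working last December?
assert store.as_of("alice.employer", last_dec, today) == "Figma"
# The database: what did we *believe* last December?
assert store.as_of("alice.employer", last_dec, last_dec) == "Stripe"
```

The same `as_of` call answers both kinds of question: pin `tx_time` to travel through the database's beliefs, pin `valid_time` to travel through the world.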

↳ four timestamps. four questions. one engine.

Why not bolt it on?

You could add timestamps to any database. Plenty of people do. But there’s a fundamental difference between timestamps as application metadata and timestamps as an engine primitive.

When bi-temporality is an engine feature, every write stamps all four timestamps automatically, corrections supersede records instead of destroying them, and any query can be pinned to a point in valid time, transaction time, or both.

None of this works if time is just a column you added to your schema.

The DuckDB analogy

DuckDB didn’t try to make SQLite better at analytics. It redesigned the engine — columnar storage, vectorised execution, zero external dependencies — for a different workload.

Kronroe takes the same approach for temporal knowledge. The engine is designed from scratch for facts that change over time.

Two markets, one engine

We built Kronroe for two audiences that need the same thing:

1. AI agent memory

Agents need to remember, correct, and reason over time. The current stack — a vector DB for retrieval, a separate store for “memory,” maybe a graph for relationships — is fragile. Kronroe unifies all three: full-text search, vector similarity, and a temporal property graph in one embedded engine.

No server. No infra. Runs in-process, works offline, and every fact is automatically bi-temporal.

2. Mobile and edge

iOS and Android apps that need relationship graphs — social features, recommendation engines, local-first knowledge bases — currently have no good embedded option. SQLite doesn’t model graphs. Neo4j requires a server. Kronroe runs natively on both platforms as a static library (iOS) or JNI dynamic library (Android).

↳ we eat our own cooking — Kindly Roe (our iOS app) uses Kronroe for its on-device graph

What comes next

Kronroe is early. The engine works. The MCP server exposes 11 tools for agent integration. Python bindings are on PyPI. iOS and Android targets compile and pass tests.

We’re building in the open — the source is on GitHub under AGPL-3.0 (with a commercial licence for apps that need it).

If you’re building an AI agent that needs to remember things properly, or a mobile app that needs an embedded graph, we’d love for you to try it. Start with the docs, install the Python package, or drop the MCP server into your agent’s tool loop.

Facts change. Your database should know that.