# archon-memory-core vs LangGraph Store
LangGraph Store is LangGraph's built-in memory primitive — a namespaced key-value store with optional vector search. archon-memory-core is a framework-agnostic memory library with opinionated contradiction handling. One is a primitive; the other is a policy.
## What each one actually is
LangGraph Store is a pluggable interface with an in-memory default and Postgres / SQLite implementations. You get `put`, `get`, and `search` against namespaced scopes. Vector search is optional. It's a durable bag you fill however you like — the retrieval policy and conflict handling are up to you.
archon-memory-core has a narrower API but makes opinionated decisions. Every memory has a persistence class (ephemeral, session, long-term, canonical) and an optional priority. Retrieval ranks by a combination of embedding similarity, persistence, and priority. Contradictions coexist in storage and get sorted at query time.
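The retrieval-time ranking idea can be sketched as follows. Everything here is an illustrative assumption — the class weights, the multiplicative scoring formula, and the field names are invented for the sketch, not archon-memory-core's actual code; only the concept (contradictions coexist in storage and get sorted at query time by similarity, persistence, and priority) comes from the description above.

```python
# Hypothetical persistence-class weights; the real library's values
# and combination rule are not documented here.
PERSISTENCE_WEIGHT = {
    "ephemeral": 0.2,
    "session": 0.5,
    "long-term": 0.8,
    "canonical": 1.0,
}

def rank(memories: list[dict], similarity: dict[str, float]) -> list[dict]:
    """Order candidates by similarity x persistence weight x priority."""
    def score(m: dict) -> float:
        return (
            similarity[m["id"]]
            * PERSISTENCE_WEIGHT[m["class"]]
            * m.get("priority", 1.0)
        )
    return sorted(memories, key=score, reverse=True)

# Two contradictory facts coexist in storage; ranking resolves them at query time.
memories = [
    {"id": "a", "text": "user prefers email", "class": "session"},
    {"id": "b", "text": "user prefers SMS", "class": "canonical"},
]
similarity = {"a": 0.90, "b": 0.85}  # pretend embedding scores

top = rank(memories, similarity)[0]
print(top["text"])  # canonical beats session despite lower raw similarity
```

The point of the sketch: because persistence class participates in the score, a slightly-less-similar canonical memory can deterministically outrank a fresher session-level contradiction.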
## Feature comparison
| Capability | LangGraph Store | archon-memory-core |
|---|---|---|
| License | MIT | Apache 2.0 |
| Install | Part of LangGraph | `pip install archon-memory-core` |
| Framework coupling | Tightly integrated with LangGraph | Framework-agnostic |
| API shape | KV store (`put` / `get` / `search`) | Memory client (`write` / `retrieve`) |
| Persistence classes | Not built-in | Built-in (ephemeral → canonical) |
| Priority scoring | DIY | Built-in |
| Contradiction handling | DIY | Retrieval-time, deterministic |
| Benchmarks | None published | AMB v2.3 (99.2% top-1) |
## When to pick LangGraph Store
- You're already deep in LangGraph and want zero new dependencies.
- Your memory is simple: save a JSON blob, look it up later, maybe vector search.
- You want full control over what gets written and how it's retrieved.
## When to pick archon-memory-core
- You want persistence classes and priority as first-class concepts instead of reinventing them.
- You want contradiction resolution that doesn't require you to write ranking logic yourself.
- You want something that works unchanged if you swap LangGraph for CrewAI, Pydantic AI, or a custom stack.
- You care about the AMB v2.3 preregistered number (99.2% top-1 on contradiction retrieval).
## Using them together
LangGraph Store is great for checkpointing, thread state, and anything that wants LangGraph's graph-scoped namespacing. Call archon-memory-core from inside a node when you need durable user-level memory with contradiction handling. A tiny adapter that implements LangGraph's `BaseStore` on top of archon-memory-core is on the roadmap — until then, direct invocation from a node works fine.
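A hedged sketch of the direct-invocation pattern. `MemoryClient`, its `write`/`retrieve` methods, and their parameters are hypothetical stand-ins for archon-memory-core's client (its real signatures are not shown in this article); the node is a plain state-in, state-out function of the kind LangGraph graphs are built from, mocked here so the snippet runs standalone.

```python
class MemoryClient:
    """Hypothetical stand-in for an archon-memory-core client."""

    def __init__(self):
        self._items = []

    def write(self, text: str, persistence: str = "session", priority: float = 1.0):
        self._items.append(
            {"text": text, "persistence": persistence, "priority": priority}
        )

    def retrieve(self, query: str, k: int = 3) -> list[str]:
        # Placeholder retrieval: the real library would rank by embedding
        # similarity, persistence class, and priority.
        return [m["text"] for m in self._items][:k]

memory = MemoryClient()

def recall_node(state: dict) -> dict:
    """A LangGraph-style node: consult durable memory, then record the turn."""
    facts = memory.retrieve(state["input"])
    memory.write(state["input"], persistence="long-term")
    return {**state, "recalled": facts}

memory.write("user prefers dark mode", persistence="canonical")
out = recall_node({"input": "what theme does the user want?"})
print(out["recalled"])  # ['user prefers dark mode']
```

The design point is the separation of concerns: LangGraph owns graph state and checkpointing, while the memory client owns user-level facts, so swapping the orchestration framework leaves the memory layer untouched.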