# archon-memory-core vs Letta
Letta (formerly MemGPT) is a full agent runtime. archon-memory-core is a memory library. Most teams evaluating both are really asking "do I want a framework that includes memory, or memory I can plug into my own framework?"
## What each one actually is
Letta is an agent runtime that descends from the MemGPT paper. It ships as a Docker-deployed server with PostgreSQL, a REST API, a Python SDK, and a UI. Memory is a first-class concept: core memory blocks edited by the agent, archival memory backed by a vector store, and a recall mechanism triggered by context pressure. You write agents against Letta's abstractions.
archon-memory-core is a Python library: `pip install archon-memory-core`, import the client, call `write()` and `retrieve()`. It runs in-process against SQLite or your existing Postgres. There is no server, no runtime, no agent abstraction: just memory primitives with explicit persistence classes and contradiction resolution at retrieval.
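The contract above can be sketched with a toy in-memory store. To be clear, this is not the real archon-memory-core API: only `write()` and `retrieve()` are named in the text, and the record fields (`persistence`, `priority`) and the additive scoring rule are illustrative assumptions standing in for the retrieval-time contradiction resolution the library is said to perform.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Record:
    key: str            # the fact slot, e.g. "user.city"
    value: str
    persistence: float  # assumed: durability of the memory class (0..1)
    priority: float     # assumed: caller-assigned importance (0..1)
    ts: float = field(default_factory=time.time)

class ToyMemory:
    """Toy stand-in for the write()/retrieve() contract described above."""

    def __init__(self):
        self._records: list[Record] = []

    def write(self, key, value, persistence=0.5, priority=0.5):
        self._records.append(Record(key, value, persistence, priority))

    def retrieve(self, key):
        # Contradiction resolution at retrieval time: among records for the
        # same key, score persistence + priority and return the winner,
        # rather than asking an agent to rewrite a memory block in place.
        candidates = [r for r in self._records if r.key == key]
        if not candidates:
            return None
        best = max(candidates, key=lambda r: (r.persistence + r.priority, r.ts))
        return best.value

mem = ToyMemory()
mem.write("user.city", "Lisbon", persistence=0.9, priority=0.4)
mem.write("user.city", "Berlin", persistence=0.3, priority=0.5)  # contradicts
print(mem.retrieve("user.city"))  # → Lisbon (1.3 beats 0.8)
```

The point of the sketch is the shape of the contract, not the scoring formula: contradicting writes coexist in storage, and the retriever, not the agent, decides which one wins.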
## Feature comparison
| Capability | Letta | archon-memory-core |
|---|---|---|
| Install footprint | Docker + Postgres + server | pip install |
| License | Apache 2.0 | Apache 2.0 |
| Runtime scope | Full agent runtime | Memory library only |
| Contradiction handling | Agent rewrites core block | Retriever scores persistence + priority |
| Benchmarked retrieval | MemGPT paper (2023) | AMB v2.3 (99.2% top-1) |
| Framework coupling | Adopt Letta primitives | Framework-agnostic |
| Works with LangGraph / CrewAI / custom | Via adapter | Direct, any framework |
| Ops surface | Server to run | None (in-process) |
## When to pick Letta
- You don't have an agent framework yet and want one with opinionated defaults.
- You want an out-of-the-box multi-agent server with a UI and REST endpoints.
- Your team is comfortable operating a Postgres-backed service.
## When to pick archon-memory-core
- You already have a framework (LangGraph, CrewAI, Pydantic AI, something custom).
- You want contradiction resolution as a retrieval-time property, not an agent instruction.
- You don't want to run another server.
- You care about the AMB v2.3 preregistered result (99.2% top-1 on contradiction retrieval).
## Using them together
Letta's core memory is great for the agent's working context (persona, current goals). archon-memory-core is great for the long tail of facts that need to survive, evolve, and sometimes contradict. Wire archon-memory-core as a tool Letta can call — keep Letta's runtime, add explicit contradiction handling underneath.
## FAQ

**How quickly can I get started with each?** archon-memory-core: `pip install`, one import. Letta: Docker, Postgres, a server to run, and a framework adapter.