archon-memory-core vs Letta

Letta (formerly MemGPT) is a full agent runtime. archon-memory-core is a memory library. Most teams evaluating both are really asking "do I want a framework that includes memory, or memory I can plug into my own framework?"

Short answer: If you want a batteries-included agent server and are willing to adopt its abstractions, Letta is a solid pick. If you already have a framework (LangGraph, CrewAI, custom) and just need durable memory that resolves contradictions, use archon-memory-core. They compose — you don't have to choose.

What each one actually is

Letta is an agent runtime that descends from the MemGPT paper. It ships as a Docker-deployed server with PostgreSQL, a REST API, a Python SDK, and a UI. Memory is a first-class concept: core memory blocks edited by the agent, archival memory backed by a vector store, and a recall mechanism triggered by context pressure. You write agents against Letta's abstractions.

archon-memory-core is a Python library: pip install archon-memory-core, import the client, and call write() and retrieve(). It runs in-process against SQLite or your existing Postgres. There is no server, no runtime, no agent abstraction — just memory primitives with explicit persistence classes and contradiction resolution at retrieval.
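To make "persistence classes and contradiction resolution at retrieval" concrete, here is a minimal self-contained sketch of the pattern. This is a stand-in, not the library's actual API: only write(), retrieve(), and SQLite backing come from the text above; the class name, parameters, and scoring rule are assumptions for illustration.

```python
# Stand-in sketch of the retrieval-time contradiction-resolution pattern.
# Everything beyond write()/retrieve() and SQLite is assumed, not the
# real archon-memory-core surface.
import sqlite3
import time

class MemoryClient:
    """In-process store: each fact carries a persistence class and a priority."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS facts ("
            "key TEXT, value TEXT, persistence TEXT, priority INTEGER, ts REAL)"
        )

    def write(self, key, value, persistence="durable", priority=0):
        # Writes never overwrite; contradictions coexist until retrieval.
        self.db.execute(
            "INSERT INTO facts VALUES (?, ?, ?, ?, ?)",
            (key, value, persistence, priority, time.time()),
        )

    def retrieve(self, key):
        # Resolution happens here: durable beats ephemeral, then higher
        # priority, then recency as the tiebreaker.
        rank = {"durable": 1, "ephemeral": 0}
        rows = self.db.execute(
            "SELECT value, persistence, priority, ts FROM facts WHERE key = ?",
            (key,),
        ).fetchall()
        if not rows:
            return None
        best = max(rows, key=lambda r: (rank.get(r[1], 0), r[2], r[3]))
        return best[0]

mem = MemoryClient()
mem.write("user.city", "Berlin", persistence="durable", priority=1)
mem.write("user.city", "Munich", persistence="ephemeral")
print(mem.retrieve("user.city"))  # Berlin: the durable, higher-priority fact wins
```

The design point is that conflicting writes are kept rather than merged, and the ranking is deterministic code rather than an agent's judgment, which is the contrast the FAQ below draws with Letta's agent-edited core memory.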

Feature comparison

| Capability | Letta | archon-memory-core |
|---|---|---|
| Install footprint | Docker + Postgres + server | pip install |
| License | Apache 2.0 | Apache 2.0 |
| Runtime scope | Full agent runtime | Memory library only |
| Contradiction handling | Agent rewrites core block | Retriever scores persistence + priority |
| Benchmarked retrieval | MemGPT paper (2023) | AMB v2.3 (99.2% top-1) |
| Framework coupling | Adopt Letta primitives | Framework-agnostic |
| Works with LangGraph / CrewAI / custom | Via adapter | Direct, any framework |
| Ops surface | Server to run | None (in-process) |

When to pick Letta

- You want a batteries-included agent server (runtime, REST API, Python SDK, UI) and are willing to adopt its abstractions.
- Agent-managed memory fits your design: core blocks the agent edits itself, plus archival recall under context pressure.
- Running a Docker + Postgres server is acceptable operational overhead for your team.

When to pick archon-memory-core

- You already have a framework (LangGraph, CrewAI, custom) and just need durable memory you can plug in.
- You want contradictions resolved deterministically at retrieval, not by trusting the agent to rewrite its own memory.
- You want zero ops surface: in-process, backed by SQLite or your existing Postgres.

Using them together

Letta's core memory is great for the agent's working context (persona, current goals). archon-memory-core is great for the long tail of facts that need to survive, evolve, and sometimes contradict. Wire archon-memory-core as a tool Letta can call — keep Letta's runtime, add explicit contradiction handling underneath.
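A sketch of what "wire archon-memory-core as a tool Letta can call" might look like. The function names and signatures here are hypothetical, the dict is a stand-in for archon-memory-core's store, and the Letta registration step is omitted because it depends on the Letta SDK version you run; the shape shown is simply a plain Python function whose docstring serves as the tool description.

```python
# Hypothetical tool functions an agent runtime could call for durable
# fact storage. The module-level dict stands in for archon-memory-core;
# a real deployment would call its write()/retrieve() instead.
from typing import Optional

_STORE: dict[str, str] = {}  # stand-in for the durable memory backend

def remember_fact(key: str, value: str) -> str:
    """Persist a durable fact outside the agent's context window."""
    _STORE[key] = value
    return f"stored {key}"

def recall_fact(key: str) -> Optional[str]:
    """Fetch a durable fact by key; returns None if unknown."""
    return _STORE.get(key)

# Runtimes like Letta generally register plain functions as tools, with
# the docstring exposed to the agent as the tool description.
print(remember_fact("user.timezone", "Europe/Berlin"))
print(recall_fact("user.timezone"))
```

With this split, Letta's core memory keeps the working context (persona, current goals) while long-lived facts flow through the tool into storage with explicit persistence semantics.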

FAQ

Is Letta a replacement for archon-memory-core?
No. Letta is a full agent runtime that happens to include memory. archon-memory-core is a standalone memory library. Different layers of the stack.
Can I use archon-memory-core with Letta?
Yes. Expose it as a tool or sidecar and have Letta call it for durable fact storage, while Letta's native memory handles working context.
Which handles contradictions better?
archon-memory-core resolves contradictions at retrieval using persistence class and priority. Letta relies on the agent to rewrite its core memory block — which compounds errors if the agent misjudges.
What's the setup cost difference?
archon-memory-core: pip install, one import. Letta: Docker, Postgres, server, framework adapter.