All-Mem: Agentic Lifelong Memory via Dynamic Topology Evolution

📅 2026-03-19
🤖 AI Summary
This work addresses the degradation in retrieval quality faced by lifelong interactive agents as ever-growing memory stores accumulate redundancy, obsolescence, and noise. To tackle this challenge, the authors propose All-Mem, a framework that maintains a structured memory repository through dynamic topological evolution. During online interaction, retrieval is anchored to a bounded visible surface; offline, an LLM-based diagnostic module proposes confidence-scored, non-destructive topological edits (SPLIT, MERGE, and UPDATE), all while preserving immutable evidence for traceability. By combining a hybrid online/offline architecture, typed links, and hop-constrained retrieval, All-Mem avoids the information loss inherent in conventional summarization-based compression. Evaluated on the LOCOMO and LONGMEMEVAL benchmarks, All-Mem outperforms existing approaches, achieving consistent gains in both retrieval accuracy and downstream question-answering performance.
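The consolidation loop summarized above might be sketched as follows. This is a minimal illustration of confidence-gated, non-destructive SPLIT/MERGE/UPDATE edits over a memory bank, under stated assumptions: all class names, fields, and the gating threshold are hypothetical, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass
class MemoryNode:
    node_id: int
    text: str              # derived, editable view of the memory
    evidence: tuple = ()   # immutable raw records, never rewritten
    active: bool = True    # inactive nodes are archived, not deleted

@dataclass
class TopologyEdit:
    op: str                # "SPLIT" | "MERGE" | "UPDATE"
    targets: list          # IDs of nodes the edit applies to
    payload: list          # new texts proposed by the LLM diagnoser
    confidence: float      # diagnoser's self-reported confidence

class MemoryBank:
    def __init__(self):
        self.nodes = {}
        self.next_id = 0

    def add(self, text, evidence=()):
        node = MemoryNode(self.next_id, text, tuple(evidence))
        self.nodes[self.next_id] = node
        self.next_id += 1
        return node

    def apply(self, edit, threshold=0.8):
        """Gate: execute only edits whose confidence clears the threshold."""
        if edit.confidence < threshold:
            return False
        sources = [self.nodes[i] for i in edit.targets]
        # new nodes inherit the sources' immutable evidence for traceability
        evidence = tuple(e for n in sources for e in n.evidence)
        if edit.op == "SPLIT":
            for text in edit.payload:   # one node per proposed facet
                self.add(text, evidence)
        elif edit.op in ("MERGE", "UPDATE"):
            self.add(edit.payload[0], evidence)
        # non-destructive: source nodes are archived, not deleted
        for n in sources:
            n.active = False
        return True

# demo: split a compound memory into two facets (IDs and texts illustrative)
bank = MemoryBank()
a = bank.add("User likes coffee and lives in Paris", evidence=("msg#12",))
edit = TopologyEdit("SPLIT", [a.node_id],
                    ["User likes coffee", "User lives in Paris"], confidence=0.92)
bank.apply(edit)  # node 0 archived; two new active nodes share its evidence
```

A low-confidence proposal (say, 0.3) would simply be rejected by the gate, which is one reading of "edits executed with gating" in the abstract.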

📝 Abstract
Lifelong interactive agents are expected to assist users over months or years, which requires continually writing long-term memories while retrieving the right evidence for each new query under fixed context and latency budgets. Existing memory systems often degrade as histories grow, yielding redundant, outdated, or noisy retrieved contexts. We present All-Mem, an online/offline lifelong memory framework that maintains a topology-structured memory bank via explicit, non-destructive consolidation, avoiding the irreversible information loss typical of summarization-based compression. In online operation, it anchors retrieval on a bounded visible surface to keep coarse-search cost bounded. Periodically, offline, an LLM diagnoser proposes confidence-scored topology edits, executed with gating, using three operators: SPLIT, MERGE, and UPDATE, while preserving immutable evidence for traceability. At query time, typed links enable hop-bounded, budgeted expansion from active anchors to archived evidence when needed. Experiments on LOCOMO and LONGMEMEVAL show improved retrieval and QA over representative baselines.
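The query-time expansion described in the abstract could look like the sketch below: a breadth-first walk from active anchor nodes that follows only permitted link types, stops at a hop limit, and halts once a retrieval budget is spent. The graph encoding, link-type names, and budget semantics are illustrative assumptions, not the paper's implementation.

```python
from collections import deque

def expand(graph, anchors, max_hops=2, budget=5,
           allowed_types=("refines", "evidence")):
    """BFS from anchor nodes over typed links, returning at most `budget`
    nodes and never traversing more than `max_hops` hops from an anchor."""
    visited = set(anchors)
    results = list(anchors)
    frontier = deque((a, 0) for a in anchors)
    while frontier and len(results) < budget:
        node, hops = frontier.popleft()
        if hops == max_hops:
            continue  # hop limit reached; do not expand further
        for link_type, nbr in graph.get(node, []):
            if link_type in allowed_types and nbr not in visited:
                visited.add(nbr)
                results.append(nbr)
                if len(results) >= budget:
                    break  # retrieval budget spent
                frontier.append((nbr, hops + 1))
    return results

# demo graph: "x" is excluded by link type, "d" by the 2-hop limit
graph = {
    "a": [("refines", "b"), ("related", "x")],
    "b": [("evidence", "c")],
    "c": [("evidence", "d")],
}
expand(graph, ["a"])  # -> ["a", "b", "c"]
```

Because both the hop count and the result budget are hard caps, the cost of each query stays bounded regardless of how large the archived memory graph grows, which matches the fixed context and latency budgets the abstract emphasizes.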
Problem

Research questions and friction points this paper is trying to address.

lifelong memory
memory retrieval
interactive agents
context budget
latency constraint
Innovation

Methods, ideas, or system contributions that make the work stand out.

lifelong memory
dynamic topology evolution
non-destructive consolidation
typed links
LLM-guided memory editing
👥 Authors

Can Lv
Beijing Advanced Innovation Center for Future Blockchain and Privacy Computing, School of Artificial Intelligence, Beihang University

Heng Chang
Tsinghua University
Trustworthy AI, Graph Representation Learning, Data Mining

Yuchen Guo
Tsinghua University
Machine Learning, Computer Vision, Information Retrieval

Shengyu Tao
Department of Electrical Engineering, Chalmers University of Technology

Shiji Zhou
Associate Professor, Beihang University
Online Learning, Stochastic Optimization, Multi-Objective Optimization, Multi-task Learning