CAM: A Constructivist View of Agentic Memory for LLM-Based Reading Comprehension

📅 2025-10-06
🤖 AI Summary
To address information overload and inefficient memory organization in large language models (LLMs) during long-document understanding, this paper proposes the first agent memory framework grounded in Piagetian constructivist theory. Methodologically, it introduces structured schema representations, scalable assimilation, and dynamic accommodation mechanisms to enable hierarchical memory construction and context-sensitive activation. Technically, it designs an incremental overlapping clustering algorithm to build a dynamic memory graph, integrating online batch-wise consolidation with query-driven retrieval to emulate human-like associative reading. Evaluated on question answering, query-focused summarization, and claim verification tasks, the approach significantly improves response coherence, accuracy, and reasoning efficiency. The framework establishes a novel paradigm for LLM-based long-text comprehension—cognitively plausible, theoretically principled, and engineering-practical—bridging foundational cognitive science with scalable AI system design.
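The summary's core mechanism — incrementally clustering incoming text chunks into an overlapping memory structure with online batch consolidation — can be illustrated with a minimal sketch. This is not the paper's implementation; the class name, the cosine-similarity threshold, and the running-mean centroid update are all illustrative assumptions:

```python
import numpy as np

class IncrementalOverlappingClusterer:
    """Toy sketch (hypothetical, not CAM's actual algorithm): each chunk
    joins EVERY existing cluster whose centroid is similar enough
    (overlapping assignment), and centroids are updated online."""

    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.centroids = []   # running-mean embedding per cluster
        self.counts = []      # member count per cluster (for the mean update)
        self.members = []     # chunk ids per cluster

    def _cos(self, a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def add_batch(self, chunk_ids, embeddings):
        """Online batch consolidation: fold a batch of new chunks into
        the existing cluster structure without reclustering from scratch."""
        for cid, emb in zip(chunk_ids, embeddings):
            hits = [i for i, c in enumerate(self.centroids)
                    if self._cos(emb, c) >= self.threshold]
            if not hits:   # no similar cluster: open a new one
                self.centroids.append(emb.copy())
                self.counts.append(1)
                self.members.append([cid])
            else:          # overlap: join every sufficiently similar cluster
                for i in hits:
                    self.counts[i] += 1
                    self.centroids[i] += (emb - self.centroids[i]) / self.counts[i]
                    self.members[i].append(cid)
```

In this reading, opening a new cluster loosely corresponds to accommodation (restructuring memory for unfamiliar content) while joining existing clusters corresponds to assimilation (absorbing content into existing schemata).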

📝 Abstract
Current Large Language Models (LLMs) are confronted with overwhelming information volume when comprehending long-form documents. This challenge raises the imperative of a cohesive memory module, which can elevate vanilla LLMs into autonomous reading agents. Despite the emergence of some heuristic approaches, a systematic design principle remains absent. To fill this void, we draw inspiration from Jean Piaget's Constructivist Theory, illuminating three traits of the agentic memory -- structured schemata, flexible assimilation, and dynamic accommodation. This blueprint forges a clear path toward a more robust and efficient memory system for LLM-based reading comprehension. To this end, we develop CAM, a prototype implementation of Constructivist Agentic Memory that simultaneously embodies structurality, flexibility, and dynamicity. At its core, CAM is endowed with an incremental overlapping clustering algorithm for structured memory development, supporting both coherent hierarchical summarization and online batch integration. During inference, CAM adaptively explores the memory structure to activate query-relevant information for contextual response, akin to the human associative process. Compared to existing approaches, our design demonstrates dual advantages in both performance and efficiency across diverse long-text reading comprehension tasks, including question answering, query-based summarization, and claim verification.
Problem

Research questions and friction points this paper is trying to address.

Addressing information overload in LLMs for long-document comprehension
Developing systematic memory design inspired by Constructivist Theory
Creating structured yet flexible memory for query-driven information retrieval
Innovation

Methods, ideas, or system contributions that make the work stand out.

Constructivist Theory-based memory system design
Incremental overlapping clustering for structured memory
Adaptive memory exploration for query-relevant activation
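The third innovation — adaptively exploring the memory structure to activate query-relevant information — can be sketched as a best-first traversal over a hierarchical memory of summary nodes and leaf chunks. The node layout (`emb`, `children`, `text` fields), the activation budget, and the traversal order are illustrative assumptions, not the paper's specified procedure:

```python
import heapq
import numpy as np

def activate_memory(query_emb, root_nodes, budget=5):
    """Hypothetical sketch: expand the most query-similar memory nodes
    first, descending from summary nodes to their children and collecting
    up to `budget` leaf text chunks for the response context."""
    def sim(node):
        a, b = query_emb, node["emb"]
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    # Max-heap via negated similarity; a tie counter avoids comparing dicts.
    frontier = [(-sim(n), i, n) for i, n in enumerate(root_nodes)]
    heapq.heapify(frontier)
    activated, tie = [], len(root_nodes)
    while frontier and len(activated) < budget:
        _, _, node = heapq.heappop(frontier)
        if not node.get("children"):       # leaf: raw text chunk
            activated.append(node["text"])
        else:                              # summary node: expand children
            for child in node["children"]:
                heapq.heappush(frontier, (-sim(child), tie, child))
                tie += 1
    return activated
```

Only branches that remain competitive with the query are ever expanded, which is what gives the query-driven activation its efficiency over flat retrieval across all chunks.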
👥 Authors
Rui Li — Gaoling School of Artificial Intelligence, Renmin University of China
Zeyu Zhang — Gaoling School of Artificial Intelligence, Renmin University of China
Xiaohe Bo — Gaoling School of Artificial Intelligence, Renmin University of China
Zihang Tian — Gaoling School of AI, Renmin University of China
Xu Chen — Gaoling School of Artificial Intelligence, Renmin University of China
Quanyu Dai — Huawei Noah's Ark Lab
Zhenhua Dong — Noah's Ark Lab, Huawei Technologies Co., Ltd.
Ruiming Tang — Huawei Noah's Ark Lab