AI Summary
Existing retrieval-augmented approaches struggle with the fragmented nature of long-term agent memory and offer inadequate support for complex temporal and multi-hop reasoning. This work proposes a dynamic schematic memory architecture that emulates the spreading-activation mechanism from cognitive science, selecting relevant subgraphs on the fly through lateral inhibition and temporal decay to enable synergistic retrieval over semantic and episodic memory. By moving beyond static vector similarity, the model mitigates the "contextual tunneling" problem and combines geometric embeddings with activation-driven graph traversal in a tripartite hybrid retrieval strategy. Evaluated on the LoCoMo benchmark, the proposed method significantly outperforms current state-of-the-art approaches, demonstrating superior performance in complex reasoning scenarios.
Abstract
While Large Language Models (LLMs) excel at generalized reasoning, standard retrieval-augmented approaches fail to address the disconnected nature of long-term agentic memory. To bridge this gap, we introduce Synapse (Synergistic Associative Processing Semantic Encoding), a unified memory architecture that transcends static vector similarity. Drawing on cognitive science, Synapse models memory as a dynamic graph in which relevance emerges from spreading activation rather than pre-computed links. By integrating lateral inhibition and temporal decay, the system dynamically highlights relevant sub-graphs while filtering out interference. We implement a Triple Hybrid Retrieval strategy that fuses geometric embeddings with activation-based graph traversal. Comprehensive evaluations on the LoCoMo benchmark show that Synapse significantly outperforms state-of-the-art methods on complex temporal and multi-hop reasoning tasks, offering a robust solution to the "Contextual Tunneling" problem. Our code and data will be made publicly available upon acceptance.
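To make the retrieval mechanism concrete, the sketch below illustrates spreading activation over a memory graph with lateral inhibition and temporal decay. All names, parameters, and formulas here are illustrative assumptions, not the paper's actual implementation: activation flows from query-seeded nodes along weighted edges, older memories decay exponentially, and each node is suppressed in proportion to its rivals' total activation.

```python
import math
from collections import defaultdict

def spread_activation(edges, seeds, timestamps, now,
                      steps=2, spread=0.5, decay_rate=0.1, inhibition=0.1):
    """Toy spreading-activation retrieval (illustrative, not Synapse itself).

    edges:      dict node -> list of (neighbor, edge_weight)
    seeds:      dict node -> initial activation (e.g. query similarity scores)
    timestamps: dict node -> creation time; older memories decay
    Returns nodes ranked by final activation.
    """
    act = defaultdict(float, seeds)
    for _ in range(steps):
        nxt = defaultdict(float)
        for node, a in act.items():
            nxt[node] += a
            for nbr, w in edges.get(node, []):
                nxt[nbr] += spread * w * a          # activation flows to neighbors
        # Temporal decay: older memories lose activation exponentially.
        for node in nxt:
            age = now - timestamps.get(node, now)
            nxt[node] *= math.exp(-decay_rate * age)
        # Lateral inhibition: each node is suppressed by rival activation,
        # sharpening the contrast between the relevant subgraph and noise.
        total = sum(nxt.values())
        act = defaultdict(float, {
            n: max(0.0, a - inhibition * (total - a) / max(len(nxt) - 1, 1))
            for n, a in nxt.items()
        })
    return sorted(act.items(), key=lambda kv: -kv[1])
```

Note how a two-step spread reaches multi-hop neighbors of the query seed, which is the property static vector similarity lacks: a node two edges away receives activation even if it never matched the query directly.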