Random Tree Model of Meaningful Memory

📅 2024-12-02
🏛️ bioRxiv
📈 Citations: 0
Influential citations: 0
📄 PDF
🤖 AI Summary
Traditional memory research focuses on the semantic structure of individual narratives but lacks a quantitative characterization of recall regularities that hold across narratives. Method: We propose a theoretical framework based on statistical ensembles of random trees, modeling a narrative as a hierarchy of key points in which each internal node is a compressed representation of its descendant leaves (the original narrative segments), and modeling recall as generation from this hierarchy under a working memory capacity constraint. Contribution/Results: Using analytical statistical mechanics, we account for universal empirical regularities, including the sublinear growth of recall length with narrative length and the increasing narrative span summarized by each recall sentence, and we derive a scale-invariant limiting distribution for recalled sentences in long narratives. The model admits closed-form analytical solutions and shows close quantitative agreement with large-scale experimental data, accurately predicting the recall-length versus narrative-length relationship, the evolution of per-sentence summarization span, and the convergence of recall distributions. This constitutes the first analytically solvable, universal model of narrative memory.

📝 Abstract
Traditional studies of memory for meaningful narratives focus on specific stories and their semantic structures but do not address common quantitative features of recall across different narratives. We introduce a statistical ensemble of random trees to represent narratives as hierarchies of key points, where each node is a compressed representation of its descendant leaves, which are the original narrative segments. Recall is modeled as proceeding from this hierarchical structure under a working memory capacity constraint. Our analytical solution aligns with observations from large-scale narrative recall experiments. Specifically, our model explains that (1) average recall length increases sublinearly with narrative length, and (2) individuals summarize increasingly longer narrative segments in each recall sentence. Additionally, the theory predicts that for sufficiently long narratives, a universal, scale-invariant limit emerges, where the fraction of a narrative summarized by a single recall sentence follows a distribution independent of narrative length.
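The abstract describes the model only at a conceptual level. As a rough illustration, the Python sketch below builds a random hierarchy over narrative segments and generates a recall by emitting one summary sentence per subtree that fits within a working memory capacity. The branching rule, the `capacity` parameter, and the stopping criterion are illustrative assumptions rather than the paper's actual ensemble or recall process, so this is a starting point for experimenting with such models, not a reproduction of the reported scaling laws.

```python
# Toy sketch (not the paper's exact ensemble or recall rule): a narrative of
# n_segments is recursively organized into a random tree; "recall" emits one
# summary sentence per subtree that already fits within a working memory
# capacity of `capacity` segments, and otherwise descends into the children.
# Branching factors, the capacity value, and the stopping rule are assumptions.

import random
from dataclasses import dataclass, field

@dataclass
class Node:
    start: int                      # index of first narrative segment covered
    end: int                        # index one past the last segment covered
    children: list = field(default_factory=list)

    @property
    def span(self) -> int:          # number of original segments this node summarizes
        return self.end - self.start

def build_random_tree(start: int, end: int, max_branch: int = 4) -> Node:
    """Recursively split the segment range [start, end) into a random hierarchy."""
    node = Node(start, end)
    if end - start <= 1:
        return node                 # a single segment is a leaf
    n_children = random.randint(2, min(max_branch, end - start))
    cuts = sorted(random.sample(range(start + 1, end), n_children - 1))
    bounds = [start, *cuts, end]
    node.children = [build_random_tree(a, b, max_branch)
                     for a, b in zip(bounds[:-1], bounds[1:])]
    return node

def recall(node: Node, capacity: int) -> list:
    """Emit one summary 'sentence' per subtree small enough to fit in working memory."""
    if node.span <= capacity or not node.children:
        return [(node.start, node.end)]   # summarized by a single recall sentence
    sentences = []
    for child in node.children:
        sentences.extend(recall(child, capacity))
    return sentences

if __name__ == "__main__":
    random.seed(0)
    for n_segments in (50, 200, 800):
        tree = build_random_tree(0, n_segments)
        sentences = recall(tree, capacity=8)
        spans = [e - s for s, e in sentences]
        print(f"N={n_segments:4d}  recall length={len(sentences):3d}  "
              f"mean span per sentence={sum(spans) / len(spans):.1f}")
```

With a fixed recall rule like this, one can sweep narrative length and plot recall length and per-sentence span; how those curves scale depends entirely on the chosen ensemble and stopping rule, which is exactly what the paper's analytical treatment pins down.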
Problem

Research questions and friction points this paper is trying to address.

Modeling memory recall for narratives
Using random trees to represent narrative hierarchies
Explaining scale-invariant recall patterns
Innovation

Methods, ideas, or system contributions that make the work stand out.

Random Tree Model
Hierarchical Representation
Working Memory Constraints
Weishun Zhong
Institute for Advanced Study
statistical physics, neural networks, machine learning
Tankut Can
Emory University
Neuroscience, Machine Learning, Physics
Antonis Georgiou
School of Natural Sciences, Institute for Advanced Study, Princeton, NJ, 08540, USA; Department of Brain Sciences, Weizmann Institute of Science, Rehovot, 76100, Israel
Ilya Shnayderman
Department of Brain Sciences, Weizmann Institute of Science, Rehovot, 76100, Israel
Mikhail Katkov
Weizmann Institute of Science
Misha Tsodyks
Weizmann Institute
Computational neuroscience, nonlinear dynamics, statistical physics, cognitive science