🤖 AI Summary
Traditional memory research focuses on the semantic structure of individual narratives but lacks quantitative characterization of cross-narrative regularities in recall.
Method: We propose a theoretical framework based on statistical ensembles of random trees, modeling narratives as hierarchical structures of key points in which each internal node is a compressed representation of its descendant leaves (the original narrative segments), and recall from this hierarchy is constrained by working memory capacity.
Contribution/Results: Using analytical statistical mechanics and hierarchical compression theory, we identify universal empirical regularities, including sublinear growth of recall length with narrative length and an increasing per-sentence summarization span, and rigorously derive a scale-invariant limiting distribution for recalled sentences in long narratives. The model admits closed-form analytical solutions and agrees quantitatively with large-scale experimental data, accurately predicting the relationship between recall length and narrative length, the evolution of per-sentence summarization span, and the convergence of recall distributions. This constitutes the first analytically solvable, universal model of narrative memory.
📝 Abstract
Traditional studies of memory for meaningful narratives focus on specific stories and their semantic structures but do not address common quantitative features of recall across different narratives. We introduce a statistical ensemble of random trees to represent narratives as hierarchies of key points, where each node is a compressed representation of its descendant leaves, which are the original narrative segments. Recall from this hierarchical structure is modeled as constrained by working memory capacity. Our analytical solution aligns with observations from large-scale narrative recall experiments. Specifically, our model explains that (1) average recall length increases sublinearly with narrative length, and (2) individuals summarize increasingly longer narrative segments in each recall sentence. Additionally, the theory predicts that for sufficiently long narratives, a universal, scale-invariant limit emerges, where the fraction of a narrative summarized by a single recall sentence follows a distribution independent of narrative length.
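The random-tree picture can be illustrated with a toy simulation. The sketch below is not the paper's calibrated model: the tree-building rule (each segment splits into 2-4 parts at uniformly random cut points), the stopping probability `q`, and the recall rule (a sentence summarizes a whole subtree, with deeper unpacking truncated stochastically) are all assumptions chosen for illustration. Under these assumptions the qualitative behavior matches the abstract: recall length grows sublinearly with narrative length, while the average span summarized per sentence grows with it.

```python
import random

def build_tree(lo, hi, rng):
    """Random tree over segments [lo, hi): split into 2-4 parts at
    uniformly random cut points, recursing down to single-segment leaves."""
    if hi - lo == 1:
        return (lo, hi, [])  # leaf = one original narrative segment
    b = min(hi - lo, rng.randint(2, 4))
    cuts = sorted(rng.sample(range(lo + 1, hi), b - 1))
    bounds = [lo] + cuts + [hi]
    children = [build_tree(bounds[i], bounds[i + 1], rng) for i in range(b)]
    return (lo, hi, children)

def recall(node, q, rng):
    """Traverse the tree; at each internal node, stop with probability q
    and emit one sentence summarizing the whole subtree, else unpack
    its children. Returns the list of (lo, hi) spans, one per sentence."""
    lo, hi, children = node
    if not children or rng.random() < q:
        return [(lo, hi)]
    out = []
    for child in children:
        out.extend(recall(child, q, rng))
    return out

def mean_recall_stats(n, q=0.3, trials=200, seed=0):
    """Average recall length and per-sentence span over an ensemble
    of random trees for a narrative of n segments."""
    rng = random.Random(seed)
    lengths, spans = [], []
    for _ in range(trials):
        sentences = recall(build_tree(0, n, rng), q, rng)
        lengths.append(len(sentences))
        spans.append(n / len(sentences))
    return sum(lengths) / trials, sum(spans) / trials

if __name__ == "__main__":
    for n in (100, 400, 1600):
        length, span = mean_recall_stats(n)
        print(n, round(length, 1), round(span, 1))
```

A branching-process argument suggests why this is sublinear: a node at depth d is emitted with probability (1-q)^d q, so the expected number of sentences scales as a power of n with exponent below 1 that depends on q and the branching statistics.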