Algorithmic Consequences of Particle Filters for Sentence Processing: Amplified Garden-Paths and Digging-In Effects

📅 2026-03-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a limitation of large language models: they fail to fully capture the processing difficulty induced by structural ambiguity, such as garden-path and digging-in effects, during sentence comprehension. The authors propose a particle-filter model of syntactic processing that explicitly represents and dynamically updates syntactic hypotheses with a finite set of particles, simulating real-time human parsing decisions under ambiguity. They demonstrate, for the first time at the algorithmic level, that the resampling mechanism inherent in particle filtering naturally gives rise to the digging-in effect, whose magnitude scales inversely with the number of particles -- a finding that contradicts the predictions of fully parallel processing models. The model quantitatively accounts for the amplification of garden-path effects and provides computational evidence that explicit representations of structural ambiguity are causally implicated in sentence processing.

📝 Abstract
Under surprisal theory, linguistic representations affect processing difficulty only through the bottleneck of surprisal. Our best estimates of surprisal come from large language models, which have no explicit representation of structural ambiguity. While LLM surprisal robustly predicts reading times across languages, it systematically underpredicts difficulty when structural expectations are violated -- suggesting that representations of ambiguity are causally implicated in sentence processing. Particle filter models offer an alternative where structural hypotheses are explicitly represented as a finite set of particles. We prove several algorithmic consequences of particle filter models, including the amplification of garden-path effects. Most critically, we demonstrate that resampling, a common practice with these models, inherently produces real-time digging-in effects -- where disambiguation difficulty increases with ambiguous region length. Digging-in magnitude scales inversely with particle count: fully parallel models predict no such effect.
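The resampling dynamic described in the abstract can be illustrated with a toy simulation (a minimal sketch, not the paper's actual model: the two-hypothesis setup, the per-word weights, and all parameter values are illustrative assumptions). Each particle holds one of two parse hypotheses; each word of the ambiguous region mildly favors the garden-path parse, and multinomial resampling after each word can drive the dispreferred (ultimately correct) parse to extinction:

```python
import random

def surviving_fraction(n_particles, region_len, p_correct=0.4,
                       n_runs=2000, seed=0):
    """Fraction of simulated runs in which the dispreferred (ultimately
    correct) parse still has at least one particle after the ambiguous
    region. Weights and probabilities are illustrative assumptions."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_runs):
        # Each particle holds one of two parse hypotheses.
        particles = ['correct' if rng.random() < p_correct else 'garden'
                     for _ in range(n_particles)]
        for _ in range(region_len):
            # Each ambiguous word mildly favors the garden-path parse
            # (0.8 vs. 1.0 are made-up likelihood weights).
            weights = [0.8 if p == 'correct' else 1.0 for p in particles]
            # Multinomial resampling: draw particles in proportion to weight.
            particles = rng.choices(particles, weights=weights, k=n_particles)
        if 'correct' in particles:
            survived += 1
    return survived / n_runs
```

In this sketch, survival of the correct parse drops as the ambiguous region lengthens (a digging-in effect) and rises with the particle count; in the limit of many particles the effect vanishes, matching the abstract's claim that fully parallel models predict no such effect.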
Problem

Research questions and friction points this paper is trying to address.

sentence processing
structural ambiguity
garden-path effects
digging-in effects
surprisal theory
Innovation

Methods, ideas, or system contributions that make the work stand out.

particle filter
garden-path effect
digging-in effect
resampling
structural ambiguity