🤖 AI Summary
This study investigates whether phase transitions in language models arise from the intrinsic structural properties of language itself rather than from long-range interactions. To this end, the authors construct a context-sensitive stochastic language model with only short-range interactions and a fixed context length, combining a grammar from the Chomsky hierarchy with Monte Carlo simulation and statistical-mechanical analysis. They report, for the first time in such models, the emergence of a finite-temperature phase transition, demonstrating that the inherent structural characteristics of language alone suffice to induce critical behavior, even in the absence of long-range correlations. These findings underscore the pivotal role of language's fundamental features in driving complex emergent phenomena and offer a new perspective on the origins of phase transitions observed in modern language models.
📝 Abstract
Since the random language model was proposed by E. DeGiuli [Phys. Rev. Lett. 122, 128301], language models have been investigated intensively from the viewpoint of statistical mechanics. Recently, the existence of a Berezinskii–Kosterlitz–Thouless transition was numerically demonstrated in models with long-range interactions between symbols. In statistical mechanics, it has long been known that long-range interactions can induce phase transitions. Therefore, it has remained unclear whether phase transitions observed in language models originate from genuinely linguistic properties that are absent in conventional spin models. In this study, we construct a random language model with short-range interactions and numerically investigate its statistical properties. Our model belongs to the class of context-sensitive grammars in the Chomsky hierarchy and allows explicit reference to contexts. We find that a phase transition occurs even when the model refers only to contexts whose length remains constant with respect to the sentence length. This result indicates that finite-temperature phase transitions in language models are genuinely induced by the intrinsic nature of language, rather than by long-range interactions.
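To give a feel for the kind of short-range Monte Carlo dynamics the abstract describes, here is a minimal Metropolis sketch over a chain of symbols with nearest-neighbor couplings. This is an illustrative toy, not the authors' grammar-based model: the coupling matrix `J`, the chain length, and the update scheme are all assumptions chosen to show how a fixed, O(1) context (here, the two adjacent symbols) drives the acceptance rule.

```python
import math
import random

def energy(chain, J):
    """Total energy of a symbol chain under nearest-neighbor couplings.

    J[a][b] is the (hypothetical) energy cost of symbol b following symbol a,
    so each update only ever consults a constant-length context.
    """
    return sum(J[chain[i]][chain[i + 1]] for i in range(len(chain) - 1))

def metropolis(n_symbols, length, T, J, steps, rng):
    """Metropolis sampling of a symbol chain at temperature T.

    Proposes single-site symbol replacements; the energy change depends
    only on the two neighboring symbols (short-range interaction).
    """
    chain = [rng.randrange(n_symbols) for _ in range(length)]
    E = energy(chain, J)
    for _ in range(steps):
        i = rng.randrange(length)
        new, old = rng.randrange(n_symbols), chain[i]
        dE = 0.0
        if i > 0:
            dE += J[chain[i - 1]][new] - J[chain[i - 1]][old]
        if i < length - 1:
            dE += J[new][chain[i + 1]] - J[old][chain[i + 1]]
        # Accept with the Metropolis probability min(1, exp(-dE / T)).
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            chain[i] = new
            E += dE
    return chain, E
```

In the actual paper the transition rules come from a context-sensitive grammar rather than a pairwise `J`, but the structure is the same: every proposed move is evaluated against a context whose length stays constant as the sentence grows.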