🤖 AI Summary
This work investigates the long-time clustering dynamics of token representations during forward propagation in deep Transformers. Methodologically, it models token evolution as an interacting particle system in the mean-field limit. Theoretically, it establishes the first quantitative convergence analysis of this clustering: under suitable regularity and parameter assumptions, the representation distribution contracts exponentially to a single Dirac point mass at an explicitly computable rate; moreover, under mean-field initialization, the tokens synchronize exponentially fast. These results reveal a fundamental connection between internal representation collapse in Transformers and Kuramoto-type phase synchronization. By integrating tools from stochastic differential equations, nonlinear dynamical systems, and functional inequalities, the study provides the first rigorous, quantitative theoretical foundation for understanding representation degeneration in large language models.
📝 Abstract
The evolution of tokens through a deep transformer model can be modeled as an interacting particle system, which has been shown to exhibit asymptotic clustering behavior akin to the synchronization phenomenon in Kuramoto models. In this work, we investigate the long-time clustering of mean-field transformer models. More precisely, under certain assumptions on the parameters of the transformer model, we establish exponential rates of contraction to a Dirac point mass: any suitably regular mean-field initialization synchronizes exponentially fast, with quantitative rates.
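The Kuramoto analogy can be made concrete with a toy simulation. The sketch below is illustrative only and is not the paper's transformer model: it integrates identical Kuramoto oscillators, dθᵢ/dt = (K/N) Σⱼ sin(θⱼ − θᵢ), whose phases contract to a single point, the analogue of the token distribution contracting to a Dirac point mass. The coupling strength, step size, and initialization are arbitrary choices for the demonstration.

```python
import math

def order_parameter(thetas):
    # r = |(1/N) * sum_j exp(i * theta_j)|; r = 1 means full synchronization.
    n = len(thetas)
    re = sum(math.cos(t) for t in thetas) / n
    im = sum(math.sin(t) for t in thetas) / n
    return math.hypot(re, im)

def simulate(thetas, coupling=1.0, dt=0.05, steps=2000):
    # Forward-Euler integration of identical Kuramoto oscillators with
    # all-to-all coupling: dtheta_i/dt = (K/N) * sum_j sin(theta_j - theta_i).
    n = len(thetas)
    for _ in range(steps):
        drift = [coupling / n * sum(math.sin(tj - ti) for tj in thetas)
                 for ti in thetas]
        thetas = [ti + dt * di for ti, di in zip(thetas, drift)]
    return thetas

# Phases spread over half the circle contract exponentially to one point.
init = [math.pi * k / 10 for k in range(10)]
final = simulate(init)
print(order_parameter(init), order_parameter(final))
```

With identical frequencies and initial phases confined to an open half-circle, the order parameter converges to 1 exponentially fast, mirroring the quantitative synchronization rates established in the paper.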