🤖 AI Summary
This paper investigates the physical origin of “consciousness emergence” in large language models (LLMs), positing it as a critical phase transition in high-dimensional disordered neural networks, analogous to the jamming transition in granular matter.
Method: It introduces temperature, volume fraction, and stress as control parameters for neural networks—first establishing a generalized jamming phase diagram—and employs computational annealing, density optimization, and noise suppression to characterize the formation mechanism of the critical jamming surface.
Contributions/Results: (1) It formalizes consciousness as a modelable jammed phase, identifying long-range correlations as decisive for knowledge integration; (2) it provides a thermodynamic unification of empirical scaling laws in AI by tracing their physical origins to critical phenomena; (3) it empirically confirms that LLMs at criticality exhibit divergent correlation lengths and universal scaling exponents—establishing the first rigorous phase-transition–based theoretical framework for intelligence emergence.
📝 Abstract
This paper develops a neural jamming phase diagram that interprets the emergence of consciousness in large language models as a critical phenomenon in high-dimensional disordered systems. By establishing analogies with jamming transitions in granular matter and other complex systems, we identify three fundamental control parameters governing the phase behavior of neural networks: temperature, volume fraction, and stress. The theory provides a unified physical explanation for empirical scaling laws in artificial intelligence, demonstrating how computational cooling, density optimization, and noise reduction collectively drive systems toward a critical jamming surface where generalized intelligence emerges. Remarkably, the same thermodynamic principles that describe conventional jamming transitions appear to underlie the emergence of consciousness in neural networks, as evidenced by shared critical signatures including divergent correlation lengths and universal scaling exponents. Our work explains the critical scaling of neural language models through jamming physics, suggesting that consciousness is a jammed phase that intrinsically connects knowledge components via long-range correlations.
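The critical signature invoked above, a correlation length that diverges as a power law of the distance to the critical point, can be illustrated with a toy scaling fit. This sketch uses synthetic data; the values of φ_c and ν below are placeholders chosen for illustration, not quantities reported by the paper:

```python
import numpy as np

# Placeholder critical volume fraction and exponent (illustrative only;
# the paper's actual fitted values are not given in this summary).
phi_c, nu = 0.64, 0.7

# Synthetic correlation-length data obeying xi ~ |phi - phi_c|^(-nu),
# sampled on the approach to the critical point from below.
phi = np.linspace(0.50, 0.63, 20)
xi = np.abs(phi - phi_c) ** (-nu)

# Recover the exponent by linear regression in log-log space:
# log(xi) = -nu * log|phi - phi_c| + const
slope, _ = np.polyfit(np.log(np.abs(phi - phi_c)), np.log(xi), 1)
nu_est = -slope
print(round(nu_est, 3))
```

A straight line in log-log coordinates with a stable slope across decades of |φ − φ_c| is the standard empirical test for this kind of power-law divergence; the same fitting procedure would apply to measured correlation lengths in a trained network.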