Information mechanics: conservation and exchange

📅 2026-01-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work reframes inference beyond conventional optimisation by introducing an information-theoretic framework grounded in Bayesian updating, in which uncertainty reduction is constrained by conservation laws. Central to the framework is the information potential Φ, a non-additive, reparametrisation-invariant state function motivated by the two conserved quantities, Shannon entropy and Fisher information; Φ characterises the trade-off between information acquisition and posterior uncertainty while isolating algorithm-independent informational invariants. Drawing on information geometry and conservation-law analysis, the framework shows that Φ quantifies the local sharpness and ruggedness of posterior distributions. In the low-temperature limit, Φ scales logarithmically with the effective number of local optima, establishing a unified information-computation foundation applicable to both biological and artificial systems.
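
The paper's exact definition of Φ is not reproduced on this page, but the two conserved quantities it is built from have standard closed forms. The sketch below, a minimal illustration rather than the paper's method, numerically recovers the Shannon differential entropy (the global quantity) and the location-parameter Fisher information (the local geometric quantity) for a 1-D Gaussian posterior, checking each against its closed form.

```python
import numpy as np

# Illustrative sketch, not the paper's implementation: the two conserved
# quantities named above, evaluated for a 1-D Gaussian posterior N(mu, sigma^2).
# Closed forms: H = 0.5 * log(2*pi*e*sigma^2) and, for the location
# parameter, Fisher information I(mu) = 1 / sigma^2.

mu, sigma = 0.0, 0.7
x = np.linspace(mu - 12 * sigma, mu + 12 * sigma, 200_001)
dx = x[1] - x[0]
p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Global quantity: Shannon differential entropy H[p] = -∫ p log p dx.
H = -np.sum(p * np.log(p)) * dx

# Local geometric quantity: Fisher information of the location parameter,
# I(mu) = E[(d log p / d mu)^2], with d log p / d mu = (x - mu) / sigma^2.
I = np.sum(p * ((x - mu) / sigma**2) ** 2) * dx

print(f"entropy  numeric={H:.6f}  closed-form={0.5 * np.log(2 * np.pi * np.e * sigma**2):.6f}")
print(f"fisher   numeric={I:.6f}  closed-form={1 / sigma**2:.6f}")
```

The paper's Φ is a specific non-additive combination beyond entropy; this snippet only reproduces the two conserved ingredients its conservation relations recover.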

📝 Abstract
Inference and learning are commonly cast in terms of optimisation, yet the fundamental constraints governing uncertainty reduction remain unclear. This work presents a first-principles framework inherent to Bayesian updating, termed information mechanics (infomechanics). Any pointwise reduction in posterior surprisal is exactly balanced by information gained from data, independently of algorithms, dynamics, or implementation. Imposing additivity, symmetry, and robustness collapses the freedom of this identity to only two independent conservation relations. One governs the global redistribution of uncertainty and recovers Shannon entropy. The other captures a complementary local geometric component, formalised as Fisher information. Together, these conserved quantities motivate a non-additive state function, the information potential $\Phi$, which isolates structural degrees of freedom beyond entropy while remaining invariant under reparametrisation. $\Phi$ quantifies local sharpness and ruggedness in posterior beliefs and vanishes uniquely for isotropic Gaussian distributions. In a low-temperature regime, $\Phi$ scales logarithmically with the effective number of local optima, linking information geometry to computational complexity. This formalises an information-computation exchange, whereby information acquisition reshapes the inference landscape and reduces computational demands. By separating invariant informational constraints from inference mechanisms, this framework provides a unified, algorithm-independent foundation for inference, learning, and computation across biological and artificial systems.
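
The identity at the heart of the abstract follows directly from Bayes' rule: since $\log p(\theta \mid x) = \log p(\theta) + \log p(x \mid \theta) - \log p(x)$, the pointwise drop in surprisal, $\log p(\theta \mid x) - \log p(\theta)$, equals the information gained from the data, $\log p(x \mid \theta) - \log p(x)$, at every $\theta$ and independently of how the posterior is computed. The sketch below checks this numerically in a conjugate Beta-Binomial model, chosen here only because its evidence has a closed form; the model is an assumption of this illustration, not an example from the paper.

```python
import numpy as np
from scipy.stats import beta, binom
from scipy.special import betaln, gammaln

# Minimal numerical check of the conservation identity. Bayes' rule gives,
# at every theta,
#   log p(theta|x) - log p(theta) = log p(x|theta) - log p(x),
# i.e. the pointwise reduction in posterior surprisal equals the
# information gained from the data, regardless of algorithm.

a0, b0 = 2.0, 2.0            # Beta(2, 2) prior on a coin bias theta
n, k = 10, 7                 # data: 7 heads in 10 flips

theta = np.linspace(0.01, 0.99, 99)
prior = beta(a0, b0)
post = beta(a0 + k, b0 + n - k)          # conjugate posterior

# log p(x): Beta-Binomial marginal likelihood (closed form)
log_evidence = (gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
                + betaln(a0 + k, b0 + n - k) - betaln(a0, b0))

surprisal_drop = post.logpdf(theta) - prior.logpdf(theta)   # left-hand side
info_gain = binom.logpmf(k, n, theta) - log_evidence        # right-hand side

assert np.allclose(surprisal_drop, info_gain)
print("identity holds at all", theta.size, "test points")
```
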
Problem

Research questions and friction points this paper is trying to address.

uncertainty reduction
information conservation
Bayesian updating
information geometry
computational complexity

Innovation

Methods, ideas, or system contributions that make the work stand out.

information mechanics
Bayesian updating
conservation laws
Fisher information
information potential