🤖 AI Summary
Establishing rigorous links between information-theoretic characterizations of nontrivial collective phenomena and their underlying physical mechanisms in complex systems remains challenging.
Method: We construct a coupled stochastic walker model driven by Gaussian noise, an analytically tractable microscopic dynamical system, and systematically investigate how partial information decomposition (PID), causal emergence, and integrated information Φ vary with physical parameters such as coupling strength and noise amplitude.
Contribution/Results: We introduce the concept of *statistical autonomy* to explain spurious emergence in uncoupled systems; demonstrate that information measures evaluated at the microscale and intrinsic timescales more faithfully capture true causal structure; and show that disentangling dynamical processes from stationary distributions markedly enhances the physical interpretability of information-theoretic analyses. Our work provides a theoretical benchmark and methodological framework for the reliable application of information theory to mechanistic inference in complex systems.
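A minimal sketch of the kind of system described above, assuming a simple linear coupling form (the exact update rule, parameter values, and `simulate` helper are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(coupling, sigma, n_steps=10_000):
    """Simulate two linearly coupled random walkers with Gaussian noise.

    Assumed minimal update rule (not necessarily the paper's exact model):
        x[t+1] = x[t] + coupling * (y[t] - x[t]) + sigma * noise
        y[t+1] = y[t] + coupling * (x[t] - y[t]) + sigma * noise
    """
    x = np.zeros(n_steps)
    y = np.zeros(n_steps)
    for t in range(n_steps - 1):
        x[t + 1] = x[t] + coupling * (y[t] - x[t]) + sigma * rng.standard_normal()
        y[t + 1] = y[t] + coupling * (x[t] - y[t]) + sigma * rng.standard_normal()
    return x, y

# Stronger coupling pulls the walkers together: the variance of their
# difference shrinks as the coupling strength grows.
for c in (0.0, 0.1, 0.5):
    x, y = simulate(c, sigma=1.0)
    print(f"coupling={c}: var(x - y) = {np.var(x - y):.2f}")
```

Sweeping the coupling and noise parameters of such a model is what lets the information-theoretic measures (PID, causal emergence, Φ) be tracked against known mechanistic ground truth.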
📝 Abstract
Understanding a complex system entails capturing the non-trivial collective phenomena that arise from interactions between its different parts. Information theory is a flexible and robust framework to study such behaviours, with several measures designed to quantify and characterise the interdependencies among the system's components. However, since these estimators rely on the statistical distributions of observed quantities, it is crucial to examine the relationships between information-theoretic measures and the system's underlying mechanistic structure. To this end, here we present an information-theoretic analytical investigation of an elementary system of interacting random walkers subject to Gaussian noise. Focusing on partial information decomposition, causal emergence, and integrated information, our results help develop intuitions about how these measures relate to the physical parameters of the system. For instance, we observe that uncoupled systems can exhibit apparently emergent properties, a behaviour we suggest may be better described as "statistically autonomous". Overall, we find that in this simple scenario information measures align more reliably with the system's mechanistic properties when calculated at the level of microscopic components, rather than their coarse-grained counterparts, and over timescales comparable with the system's intrinsic dynamics. Moreover, we show that approaches that separate the contributions of the system's dynamics and steady-state distribution (e.g. via causal perturbations) may help strengthen the interpretation of information-theoretic analyses.
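The abstract's point about timescales can be illustrated with a hedged sketch: for a Gaussian process, the time-lagged mutual information I(X_t; X_{t+τ}) reduces to −½ log(1 − ρ²) with ρ the lagged autocorrelation. The AR(1) "leaky walker" below and the `lagged_gaussian_mi` helper are illustrative assumptions, not the paper's setup; they show how information about the past decays once the lag exceeds the process's intrinsic timescale.

```python
import numpy as np

rng = np.random.default_rng(1)

# One leaky walker (a discrete OU / AR(1) process) as a stand-in for the
# system's intrinsic dynamics; parameter values are illustrative only.
leak, sigma, n = 0.9, 1.0, 200_000
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = leak * x[t] + sigma * rng.standard_normal()

def lagged_gaussian_mi(series, lag):
    """Mutual information I(X_t; X_{t+lag}) under a Gaussian assumption:
    I = -0.5 * log(1 - rho^2), with rho the lagged autocorrelation."""
    rho = np.corrcoef(series[:-lag], series[lag:])[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

# MI drops off sharply once the lag exceeds the intrinsic timescale
# (~ -1/log(leak) steps for this AR(1) process).
for lag in (1, 10, 100):
    print(f"lag={lag}: MI ~ {lagged_gaussian_mi(x, lag):.3f} nats")
```

This mirrors why measures evaluated over lags far from the system's intrinsic dynamics can misrepresent its causal structure: at long lags the measured dependence says more about the stationary distribution than about the dynamics.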