🤖 AI Summary
Addressing three challenges in modeling chaotic neural population dynamics (interpretable equation discovery, poor noise robustness, and unobservable hidden states), this paper proposes PEM-UDE, a framework that integrates the prediction-error method (PEM) with universal differential equations (UDEs), augmented by symbolic regression and a sparse-network constraint. For the first time in neural mass modeling, it incorporates a connection-density constraint and reveals emergent relationships between connection density, oscillation frequency, and synchronization. By smoothing the optimization landscape, PEM suppresses chaotic sensitivity during fitting, enabling accurate recovery of hidden states and faithful reconstruction of the underlying dynamics. The method recovers ground-truth equations for both the Rössler system and electrical-circuit data corrupted by strong noise (SNR = −14 dB). It further demonstrates cross-regional generalizability and neurobiological interpretability when applied to electrophysiological recordings from three distinct brain regions.
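The core trick the summary describes (suppressing chaotic sensitivity during fitting by feeding the prediction error back into the model) can be illustrated with a simple observer on the Rössler system. This is a minimal sketch only, not the paper's implementation: the Euler integrator, the feedback gain `k`, and the choice of the second coordinate as the observed channel are illustrative assumptions.

```python
import numpy as np

# Standard Rössler parameters (a, b, c); classic textbook values, not from the paper.
a, b, c = 0.2, 0.2, 5.7

def rossler_rhs(state):
    x, y, z = state
    return np.array([-y - z, x + a * y, b + z * (x - c)])

def simulate(x0, n_steps, dt=0.01, obs=None, k=0.0):
    """Euler integration of the Rössler system. If an observed time series `obs`
    is supplied, a prediction-error feedback k * (obs - y) is added to the
    observed coordinate's derivative (a simple observer, PEM-style)."""
    traj = np.empty((n_steps, 3))
    state = np.array(x0, dtype=float)
    for i in range(n_steps):
        traj[i] = state
        dstate = rossler_rhs(state)
        if obs is not None:
            # Pull the simulated observed coordinate toward the data.
            dstate[1] += k * (obs[i] - state[1])
        state = state + dt * dstate
    return traj

# Ground-truth trajectory; its second coordinate plays the role of the data.
true_traj = simulate([1.0, 1.0, 1.0], 5000)
obs_y = true_traj[:, 1]

# Start from a perturbed initial condition: the free-running simulation diverges
# (chaos), while the error-corrected run stays locked to the observations and
# recovers the hidden coordinates as well.
free = simulate([1.5, 1.0, 1.0], 5000)
corrected = simulate([1.5, 1.0, 1.0], 5000, obs=obs_y, k=5.0)

err_free = np.abs(free[-2000:] - true_traj[-2000:]).mean()
err_corr = np.abs(corrected[-2000:] - true_traj[-2000:]).mean()
print(err_corr < err_free)
```

Because the feedback term vanishes when the model output matches the data, it removes chaotic divergence from the fitting loop without biasing the optimal parameters, which is the property the paper exploits.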
📝 Abstract
Discovering governing equations that describe complex chaotic systems remains a fundamental challenge in physics and neuroscience. Here, we introduce the PEM-UDE method, which combines the prediction-error method with universal differential equations to extract interpretable mathematical expressions from chaotic dynamical systems, even with limited or noisy observations. This approach succeeds where traditional techniques fail by smoothing the optimization landscape and removing chaotic properties during fitting without distorting the optimal parameters. We demonstrate its efficacy by recovering hidden states in the Rössler system and reconstructing dynamics from noise-corrupted electrical circuit data, where the correct functional form of the dynamics is recovered even when one of the observed time series is corrupted by noise five times the magnitude of the true signal. Our method recovers the correct dynamics where direct symbolic regression methods, such as SINDy, fail given the same amount of data and noise. Importantly, when applied to neural populations, our method derives novel governing equations that respect biological constraints such as network sparsity - a constraint necessary for cortical information processing yet not captured in next-generation neural mass models - while preserving microscale neuronal parameters. These equations predict an emergent relationship between connection density and both oscillation frequency and synchrony in neural circuits. We validate these predictions using three intracranial electrode recording datasets from the medial entorhinal cortex, prefrontal cortex, and orbitofrontal cortex. Our work provides a pathway toward mechanistic, multi-scale brain models that generalize across diverse neural architectures, bridging the gap between single-neuron dynamics and macroscale brain activity.
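The two noise figures quoted above are consistent with each other: noise five times the magnitude of the signal corresponds to a signal-to-noise ratio of 10·log10(1/5²) ≈ −14 dB. A quick numerical check, using a synthetic sinusoid as a stand-in for the circuit signal (the actual data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 100, 10_000)
signal = np.sin(t)  # stand-in for one observed coordinate

# Gaussian noise scaled to 5x the signal's standard deviation.
noise = 5.0 * signal.std() * rng.standard_normal(t.size)

# SNR in decibels: 10 * log10(signal power / noise power).
snr_db = 10 * np.log10(signal.var() / noise.var())
print(snr_db)  # close to -14 dB, since 10 * log10(1/25) ≈ -13.98
```

This confirms that the "5x the magnitude" corruption and the −14 dB SNR in the summary describe the same noise level.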