Scientific Machine Learning of Chaotic Systems Discovers Governing Equations for Neural Populations

📅 2025-07-04
📈 Citations: 0 · Influential: 0
🤖 AI Summary
Addressing key challenges in modeling chaotic neural population dynamics (interpretable equation discovery, poor noise robustness, and unobservable hidden states), this paper proposes PEM-UDE, a framework that integrates the prediction-error method (PEM) with universal differential equations (UDEs), augmented by symbolic regression and sparse network regularization. For the first time in neural population modeling, it imposes a connectivity-density constraint and derives emergent relationships between connection density, oscillation frequency, and synchronization. By smoothing the optimization landscape, the prediction-error feedback suppresses chaotic sensitivity during fitting, enabling accurate recovery of hidden states and faithful reconstruction of the underlying dynamics. The method recovers ground-truth equations from both the Rössler system and electrical-circuit data corrupted by noise five times the signal magnitude (SNR ≈ −14 dB), and it demonstrates cross-regional generalizability and neurobiological interpretability on electrophysiological recordings from three distinct brain regions.
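
To make the mechanism concrete, the minimal Python sketch below illustrates the prediction-error idea on the Rössler system: while fitting, the candidate model's ODE is augmented with a feedback term K·(x_obs − x) that continuously pulls the simulated trajectory toward the observed data, so chaotic trajectories no longer diverge and the loss varies smoothly with the parameters. The gain K, the loss, and all function names here are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch of the prediction-error (PEM) idea, not the paper's code:
# during fitting, the model ODE gets an observer feedback term K*(x_obs - x)
# that pulls trajectories toward the data, suppressing chaotic divergence.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d

# Ground-truth Rössler system (a = 0.2, b = 0.2, c = 5.7 are the classic values).
def rossler(t, s, a=0.2, b=0.2, c=5.7):
    x, y, z = s
    return [-y - z, x + a * y, b + z * (x - c)]

t_eval = np.linspace(0, 50, 2000)
sol = solve_ivp(rossler, (0, 50), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
x_obs = interp1d(sol.t, sol.y[0], fill_value="extrapolate")  # observed coordinate

def pem_rhs(t, s, theta, K):
    """Candidate model with observer feedback on the observed coordinate x."""
    x, y, z = s
    a, b, c = theta
    innovation = x_obs(t) - x          # prediction error against the data
    return [-y - z + K * innovation,   # feedback term smooths the loss landscape
            x + a * y,
            b + z * (x - c)]

def loss(theta, K=5.0):
    sim = solve_ivp(pem_rhs, (0, 50), [1.0, 1.0, 1.0], t_eval=t_eval,
                    args=(theta, K), rtol=1e-6)
    return np.mean((sim.y[0] - sol.y[0]) ** 2)

# With feedback (K > 0) the loss varies smoothly in theta; with K = 0 the
# chaotic sensitivity makes it jagged and effectively unoptimizable.
print(loss([0.2, 0.2, 5.7]), loss([0.25, 0.2, 5.7]))
```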

📝 Abstract
Discovering governing equations that describe complex chaotic systems remains a fundamental challenge in physics and neuroscience. Here, we introduce the PEM-UDE method, which combines the prediction-error method with universal differential equations to extract interpretable mathematical expressions from chaotic dynamical systems, even with limited or noisy observations. This approach succeeds where traditional techniques fail by smoothing optimization landscapes and removing the chaotic properties during the fitting process without distorting optimal parameters. We demonstrate its efficacy by recovering hidden states in the Rössler system and reconstructing dynamics from noise-corrupted electrical circuit data, where the correct functional form of the dynamics is recovered even when one of the observed time series is corrupted by noise 5x the magnitude of the true signal. We demonstrate that this method is capable of recovering the correct dynamics, whereas direct symbolic regression methods, such as SINDy, fail to do so with the given amount of data and noise. Importantly, when applied to neural populations, our method derives novel governing equations that respect biological constraints such as network sparsity (a constraint necessary for cortical information processing yet not captured in next-generation neural mass models) while preserving microscale neuronal parameters. These equations predict an emergent relationship between connection density and both oscillation frequency and synchrony in neural circuits. We validate these predictions using three intracranial electrode recording datasets from the medial entorhinal cortex, prefrontal cortex, and orbitofrontal cortex. Our work provides a pathway to develop mechanistic, multi-scale brain models that generalize across diverse neural architectures, bridging the gap between single-neuron dynamics and macroscale brain activity.
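
For concreteness, the Rössler system referenced above has the well-known form below; the second line shows one standard way of writing the observer-style prediction-error correction that PEM adds to a universal differential equation during fitting. This is a generic formulation for illustration, and the paper's exact parameterization may differ.

$$\dot{x} = -y - z, \qquad \dot{y} = x + a\,y, \qquad \dot{z} = b + z\,(x - c)$$

$$\dot{\hat{\mathbf{x}}} = f_\theta(\hat{\mathbf{x}}) + \mathrm{NN}_\phi(\hat{\mathbf{x}}) + K\left(y_{\mathrm{obs}}(t) - g(\hat{\mathbf{x}})\right)$$

Here $f_\theta$ is the known mechanistic part, $\mathrm{NN}_\phi$ is the universal (neural-network) component of the UDE, $g$ maps model states to observables, and the gain $K$ feeds the prediction error back into the dynamics; it is this feedback that smooths the optimization landscape and removes the chaotic sensitivity during fitting.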
Problem

Research questions and friction points this paper is trying to address.

Discover governing equations for chaotic neural systems
Extract interpretable math from noisy chaotic data
Derive biologically constrained neural population dynamics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines prediction-error method with universal differential equations
Smooths optimization landscapes and removes chaotic properties
Derives governing equations respecting biological constraints like sparsity
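
As a reference point for the SINDy comparison in the abstract, the sketch below implements the standard sequentially thresholded least-squares (STLSQ) step of SINDy on clean, densely sampled Rössler data, where it succeeds; the paper's point is that this direct approach breaks down at the noise levels and data volumes it studies. The library, threshold, and sampling choices here are illustrative assumptions, not the paper's setup.

```python
# Minimal STLSQ (the sparse-regression core of SINDy) on clean Rössler data.
import numpy as np
from scipy.integrate import solve_ivp

def rossler(t, s, a=0.2, b=0.2, c=5.7):
    x, y, z = s
    return [-y - z, x + a * y, b + z * (x - c)]

t = np.linspace(0, 25, 5000)
sol = solve_ivp(rossler, (0, 25), [1.0, 1.0, 1.0], t_eval=t, rtol=1e-9)
X = sol.y.T
dX = np.gradient(X, t, axis=0)  # numerical derivatives: the noise-sensitive step

# Candidate library: [1, x, y, z, x^2, xy, xz, y^2, yz, z^2]
x, y, z = X.T
Theta = np.column_stack([np.ones_like(x), x, y, z,
                         x * x, x * y, x * z, y * y, y * z, z * z])

def stlsq(Theta, dX, threshold=0.05, n_iter=10):
    """Sequentially thresholded least squares: fit, zero small terms, refit."""
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for k in range(dX.shape[1]):      # refit each equation on its support
            big = ~small[:, k]
            if big.any():
                Xi[big, k] = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)[0]
    return Xi

Xi = stlsq(Theta, dX)
# On clean data the rows recover -y - z, x + 0.2y, and 0.2 + xz - 5.7z.
print(np.round(Xi.T, 3))
```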
👥 Authors

Anthony G. Chesebro
Department of Biomedical Engineering and Laufer Center for Physical and Quantitative Biology, State University of New York at Stony Brook, NY, USA; Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, MA, USA

David Hofmann
Department of Biomedical Engineering and Laufer Center for Physical and Quantitative Biology, State University of New York at Stony Brook, NY, USA; Computer Science and Artificial Intelligence Lab, Massachusetts Institute of Technology, MA, USA

Vaibhav Dixit
MIT

Earl K. Miller
Picower Professor of Neuroscience, Massachusetts Institute of Technology (neuroscience, cognitive science)

Richard H. Granger
Psychological and Brain Sciences, Dartmouth College, NH, USA

Alan Edelman
Professor of Applied Mathematics, Member of Computer Science & AI Lab, MIT (Corgis, Random Matrix Theory, Julia, Numerical Linear Algebra, Parallel Computing)

Christopher V. Rackauckas
Computer Science and Artificial Intelligence Lab, Massachusetts Institute of Technology, MA, USA

Lilianne R. Mujica-Parodi
Department of Biomedical Engineering and Laufer Center for Physical and Quantitative Biology, State University of New York at Stony Brook, NY, USA; Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, MA, USA; Computer Science and Artificial Intelligence Lab, Massachusetts Institute of Technology, MA, USA; Santa Fe Institute, NM, USA

Helmut H. Strey
Department of Biomedical Engineering and Laufer Center for Physical and Quantitative Biology, State University of New York at Stony Brook, NY, USA; Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital and Harvard Medical School, MA, USA; Computer Science and Artificial Intelligence Lab, Massachusetts Institute of Technology, MA, USA