🤖 AI Summary
Learning governing equations from sparse, partially observed, and noisy time-series data remains challenging for dynamic systems.
Method: This paper proposes an integrated framework that jointly learns the ordinary differential equation (ODE) structure and the latent state trajectory. It combines library-based sparse regression with reproducing kernel Hilbert space (RKHS) modeling, discretizes the ODE via least-squares kernel collocation, and formulates a unified optimization objective coupling state estimation and equation discovery.
Contribution/Results: Unlike conventional “interpolate-then-identify” paradigms, our method simultaneously infers continuous state trajectories and governing ODEs within a shared RKHS, ensuring solution smoothness and physical consistency. Experiments demonstrate substantial improvements in equation identification accuracy and state reconstruction fidelity under low sampling rates and high noise levels. Data requirements are reduced by over 50%, and the method outperforms state-of-the-art approaches—including SINDy and DeepOde—in robustness and modeling flexibility.
📝 Abstract
We develop an all-at-once modeling framework for learning systems of ordinary differential equations (ODE) from scarce, partial, and noisy observations of the states. The proposed methodology amounts to a combination of sparse recovery strategies for the ODE over a function library combined with techniques from reproducing kernel Hilbert space (RKHS) theory for estimating the state and discretizing the ODE. Our numerical experiments reveal that the proposed strategy leads to significant gains in terms of accuracy, sample efficiency, and robustness to noise, both in terms of learning the equation and estimating the unknown states. This work demonstrates capabilities well beyond existing and widely used algorithms while extending the modeling flexibility of other recent developments in equation discovery.