🤖 AI Summary
To address the lack of theoretical guarantees on stability and robustness in learned neural dynamical systems, this paper proposes a modeling paradigm with provable global contraction. Methodologically, it introduces the first neural framework with global contraction guarantees under arbitrary Riemannian metrics; it parameterizes the extended linearization of the nonlinear vector field and couples it with a learnable diffeomorphism from the data space to a latent space, so that contraction enforced in the latent space yields global exponential contraction in the original state space; and it integrates implicit metric learning with Riemannian contraction theory to embed stability end to end. Evaluated on the high-dimensional LASA, multi-link pendulum, and Rosenbrock benchmarks, the proposed model achieves both global exponential stability and strong robustness, improving open-loop prediction accuracy and closed-loop control reliability.
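The latent-space construction in the summary can be illustrated with a minimal sketch (not the paper's implementation): contraction is enforced on a simple latent system, and the dynamics are pulled back through a diffeomorphism to the data space, where they inherit global contraction in the induced metric. Here a hand-picked elementwise map `phi` and linear latent dynamics `g` stand in for the learned components.

```python
import numpy as np

def phi(x):
    # Elementwise strictly increasing map, hence a diffeomorphism on R^n
    # (illustrative stand-in for the learned diffeomorphism).
    return x + 0.5 * np.tanh(x)

def jac_phi(x):
    # Jacobian of phi is diagonal with positive entries: invertible everywhere.
    return np.diag(1.0 + 0.5 * (1.0 - np.tanh(x) ** 2))

def g(z):
    # Latent dynamics: contracting with rate 1 in the Euclidean metric.
    return -z

def f(x):
    # Pullback dynamics: f(x) = Dphi(x)^{-1} g(phi(x)).
    # Contraction of g in the latent space implies contraction of f in the
    # data space, in the metric Dphi(x)^T Dphi(x).
    return np.linalg.solve(jac_phi(x), g(phi(x)))

# Forward-Euler rollout: trajectories contract toward the equilibrium
# x_star = phi^{-1}(0) = 0.
x = np.array([2.0, -1.5, 0.7])
for _ in range(2000):
    x = x + 0.01 * f(x)
print(np.linalg.norm(x))  # ≈ 0
```

Because `phi` is fixed here, the data-space metric is fixed too; in the paper the diffeomorphism is trained, which is what allows contraction with respect to more general, learned metrics.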
📝 Abstract
Global stability and robustness guarantees in learned dynamical systems are essential to ensure that the systems remain well behaved in the face of uncertainty. We present Extended Linearized Contracting Dynamics (ELCD), the first neural network-based dynamical system with global contractivity guarantees in arbitrary metrics. The key feature of ELCD is a parametrization of the extended linearization of the nonlinear vector field. In its most basic form, ELCD is guaranteed to be (i) globally exponentially stable, (ii) equilibrium contracting, and (iii) globally contracting with respect to some metric. To allow for contraction with respect to more general metrics in the data space, we train diffeomorphisms between the data space and a latent space and enforce contractivity in the latent space, which ensures global contractivity in the data space. We demonstrate the performance of ELCD on the high-dimensional LASA, multi-link pendulum, and Rosenbrock datasets.
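A hedged, minimal sketch of the extended-linearization idea in the abstract (not the paper's actual architecture): write the vector field as f(x) = A(x)(x - x*) and parameterize A(x) so that its symmetric part is uniformly negative definite, A(x) = -B(x)B(x)^T - eps*I + (S(x) - S(x)^T). Then x* is globally exponentially stable, since with V = ||x - x*||^2 we get V' = 2(x - x*)^T A(x)(x - x*) <= -2*eps*V. The maps `B` and `S` below use random fixed weights as stand-ins for neural networks; the paper's full conditions for equilibrium contraction are stronger than this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, eps = 3, 0.5
W_b = rng.standard_normal((n, n))  # hypothetical fixed weights
W_s = rng.standard_normal((n, n))
x_star = np.array([1.0, -1.0, 0.5])  # designated equilibrium

def A(x):
    B = 0.5 * np.tanh(np.outer(x, x) @ W_b)  # state-dependent factor
    S = 0.5 * np.tanh(np.outer(x, x) @ W_s)  # seed of the skew-symmetric part
    # -B B^T is negative semidefinite, -eps*I shifts it strictly negative,
    # and (S - S^T) is skew, so it does not affect the symmetric part.
    return -B @ B.T - eps * np.eye(n) + (S - S.T)

def f(x):
    # Extended linearization: the equilibrium x_star is built in by design.
    return A(x) @ (x - x_star)

# By construction, the symmetric part of A(x) is <= -eps * I at every state.
x0 = rng.standard_normal(n)
sym = 0.5 * (A(x0) + A(x0).T)
print(np.max(np.linalg.eigvalsh(sym)))  # <= -eps

# Forward-Euler rollout converges to the designated equilibrium x_star.
x = x0.copy()
for _ in range(5000):
    x = x + 0.01 * f(x)
print(np.linalg.norm(x - x_star))  # near zero
```

The skew-symmetric term costs nothing in the stability argument but lets the parameterization represent rotational flow around the equilibrium, which a purely symmetric A(x) could not.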