A Riemannian Framework for Learning Reduced-order Lagrangian Dynamics

📅 2024-10-24
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the challenge of balancing model complexity and data efficiency when modeling high-dimensional nonlinear dynamical systems, this paper proposes a Riemannian geometry-driven deep learning framework that tightly integrates Riemannian manifold learning with model-order reduction: it jointly learns a structure-preserving nonlinear latent space and the associated low-dimensional Lagrangian dynamics. The method combines Riemannian optimization, geometric deep networks, and physics-informed inductive biases, in particular Lagrangian priors, to promote physical consistency, interpretability, and generalization. Evaluated on rigid-body and deformable-body systems, the framework improves long-term prediction accuracy and data efficiency, and the inferred reduced Lagrangian models remain interpretable and physically plausible. The work bridges differential geometry and analytical mechanics with practical engineering applicability in high-dimensional dynamical modeling.

📝 Abstract
By incorporating physical consistency as inductive bias, deep neural networks display increased generalization capabilities and data efficiency in learning nonlinear dynamic models. However, the complexity of these models generally increases with the system dimensionality, requiring larger datasets, more complex deep networks, and significant computational effort. We propose a novel geometric network architecture to learn physically-consistent reduced-order dynamic parameters that accurately describe the original high-dimensional system behavior. This is achieved by building on recent advances in model-order reduction and by adopting a Riemannian perspective to jointly learn a non-linear structure-preserving latent space and the associated low-dimensional dynamics. Our approach enables accurate long-term predictions of the high-dimensional dynamics of rigid and deformable systems with increased data efficiency by inferring interpretable and physically-plausible reduced Lagrangian models.
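To make the idea in the abstract concrete, here is a minimal numpy sketch of a reduced Lagrangian model: a nonlinear encoder maps the high-dimensional configuration to a low-dimensional latent state, which is then evolved under latent Euler-Lagrange dynamics. This is an illustrative assumption, not the paper's actual architecture: the encoder, the constant positive-definite latent mass matrix, and the quadratic latent potential are all toy stand-ins for trained networks, and keeping the mass matrix constant avoids the Christoffel-symbol terms a state-dependent metric would require.

```python
import numpy as np

# Hypothetical sketch (NOT the paper's architecture): encode q into a
# latent z, then evolve z under L(z, zdot) = 0.5 zdot^T M zdot - V(z).
# With constant M, Euler-Lagrange reduces to zdd = M^{-1} (-grad V(z)).

rng = np.random.default_rng(0)

def encoder(q, W):
    """Stand-in for a learned nonlinear encoder: z = tanh(W q)."""
    return np.tanh(W @ q)

def make_mass(A):
    """Positive-definite latent mass matrix M = A A^T + eps I."""
    return A @ A.T + 1e-3 * np.eye(A.shape[0])

def grad_potential(z):
    """Gradient of a toy quadratic latent potential V(z) = 0.5 ||z||^2."""
    return z

def rollout(z0, zdot0, M, dt, steps):
    """Semi-implicit Euler integration of the latent Euler-Lagrange ODE
    (a symplectic scheme, in the spirit of structure preservation)."""
    Minv = np.linalg.inv(M)
    z, zdot = z0.copy(), zdot0.copy()
    traj = [z.copy()]
    for _ in range(steps):
        zdot = zdot + dt * (Minv @ (-grad_potential(z)))
        z = z + dt * zdot
        traj.append(z.copy())
    return np.stack(traj)

# Reduce a 12-dim configuration to a 3-dim latent state and roll out.
W = rng.standard_normal((3, 12)) * 0.3
A = rng.standard_normal((3, 3))
q0 = rng.standard_normal(12)
z0 = encoder(q0, W)
traj = rollout(z0, np.zeros(3), make_mass(A), dt=0.01, steps=200)
print(traj.shape)  # (201, 3)
```

Because the integrator is symplectic and the toy potential is quadratic, the latent trajectory stays bounded over long rollouts, which is the kind of long-term stability the abstract attributes to structure preservation.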
Problem

Research questions and friction points this paper is trying to address.

Learning reduced-order Lagrangian dynamics efficiently
Improving data efficiency in nonlinear dynamic models
Accurate long-term predictions of high-dimensional systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Geometric network architecture for reduced-order dynamics
Riemannian perspective for structure-preserving latent space
Physically-consistent Lagrangian models for efficient predictions
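The bullets above combine two learning problems, a structure-preserving latent space and physically consistent latent dynamics. One common way such joint objectives are assembled is a weighted sum of a reconstruction term and a latent prediction term; the sketch below is a hypothetical illustration of that pattern, not the paper's actual loss.

```python
import numpy as np

# Hypothetical joint-training objective (illustrative, not the paper's loss):
# an autoencoding term keeps the latent space faithful to the data, and a
# dynamics term keeps the integrated reduced model consistent with encoded
# trajectories. All inputs are stand-ins for network outputs.

def reconstruction_loss(q, q_hat):
    """Autoencoding term: decoder(encoder(q)) should match q."""
    return np.mean((q - q_hat) ** 2)

def dynamics_loss(z_next_pred, z_next_true):
    """Latent prediction term: integrated Lagrangian dynamics should
    match the encoding of the next observed state."""
    return np.mean((z_next_pred - z_next_true) ** 2)

def total_loss(q, q_hat, z_next_pred, z_next_true, w_dyn=1.0):
    """Weighted joint objective; w_dyn trades off the two terms."""
    return reconstruction_loss(q, q_hat) + w_dyn * dynamics_loss(z_next_pred, z_next_true)

# Toy check: perfect reconstruction and perfect prediction give zero loss.
rng = np.random.default_rng(1)
q = rng.standard_normal((8, 12))
loss = total_loss(q, q, np.zeros((8, 3)), np.zeros((8, 3)))
print(loss)  # 0.0
```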