Learning Generalized Hamiltonian Dynamics with Stability from Noisy Trajectory Data

📅 2025-09-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
For highly noisy and sparsely sampled phase-space trajectory data, this paper proposes an unsupervised learning framework for unified modeling of conservative, dissipative, and port-Hamiltonian systems. Methodologically, the authors introduce a sparse symplectic Gaussian process model grounded in variational Bayesian inference, incorporating symplectic geometric constraints, stability-inducing priors, and energy-conservation regularization; random Fourier features accelerate kernel computation, while the evidence lower bound (ELBO) is jointly optimized with multi-gradient physics-informed consistency losses. The key contribution is the first integration of variational inference with explicit symplectic structure embedding in a single framework, enabling simultaneous dynamical-system identification, uncertainty quantification, and long-term stable prediction. Experiments demonstrate high-fidelity reconstruction of system evolution under strong noise, bounded prediction error, and physically consistent long-horizon behavior.
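As a rough illustration of the random Fourier feature acceleration mentioned in the summary (a standard kernel-approximation technique; the paper's actual sparse symplectic kernel construction is not reproduced here, and all names below are illustrative), the idea is to replace an exact RBF kernel matrix with an inner product of low-dimensional random cosine features:

```python
import numpy as np

def rff_features(X, n_features=2000, lengthscale=1.0, seed=0):
    """Random Fourier features z(x) such that z(x)·z(y) ≈ exp(-|x-y|²/(2ℓ²))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)             # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the feature-based approximation Z Zᵀ with the exact RBF kernel.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
Z = rff_features(X)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
err = np.abs(K_approx - K_exact).max()
```

Because the GP posterior can then be computed in the feature space, the cost drops from cubic in the number of data points to cubic in the (much smaller) number of features.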

📝 Abstract
We introduce a robust framework, based on variational Bayesian inference, for unsupervised learning of generalized Hamiltonian dynamics from noisy, sparse phase-space data. Although conservative, dissipative, and port-Hamiltonian systems may share the same initial total energy of a closed system, it is challenging for a single Hamiltonian network model to capture their distinctive and varying motion dynamics from sampled observational phase-space trajectories. To address this Hamiltonian manifold learning challenge, we extend sparse symplectic, random Fourier Gaussian process learning with predictive successive numerical estimations of the Hamiltonian landscape, using a generalized form of state and conjugate-momentum Hamiltonian dynamics appropriate to the different classes of conservative, dissipative, and port-Hamiltonian physical systems. In addition to the kernelized evidence lower bound (ELBO) loss for data fidelity, we incorporate stability and conservation constraints as additional hyper-parameter-balanced loss terms that regularize the model's multi-gradients, enforcing physical correctness for improved prediction accuracy with bounded uncertainty.
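The generalized dynamics the abstract refers to can be written in the standard port-Hamiltonian form dx/dt = (J − R)∇H(x) + Gu, where the skew-symmetric J encodes the symplectic (conservative) part, the positive semi-definite R encodes dissipation, and Gu is an external port input. The paper learns H and these structure matrices from data; the sketch below (an assumption-laden toy, not the paper's model) just integrates this form for a damped unit oscillator to show how dissipation makes the energy decay:

```python
import numpy as np

# Port-Hamiltonian form: dx/dt = (J - R) ∇H(x) + G u, with x = (q, p)
# and H(q, p) = (q² + p²)/2, so ∇H(x) = x for this toy oscillator.
J = np.array([[0.0, 1.0], [-1.0, 0.0]])   # skew-symmetric symplectic structure
R = np.array([[0.0, 0.0], [0.0, 0.1]])    # PSD dissipation (damping on momentum)
G = np.array([0.0, 1.0])                  # input port (unused here, u = 0)

def step(x, dt=0.01, u=0.0):
    """One explicit-Euler step of the generalized dynamics (illustrative only)."""
    return x + dt * ((J - R) @ x + G * u)

x = np.array([1.0, 0.0])
energies = []
for _ in range(1000):
    energies.append(0.5 * float(x @ x))   # H(q, p) along the trajectory
    x = step(x)
```

With R = 0 the same code gives a conservative system, and a nonzero Gu gives a driven port-Hamiltonian one, which is exactly the unification the abstract describes.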
Problem

Research questions and friction points this paper is trying to address.

Learning generalized Hamiltonian dynamics from noisy data
Capturing distinct motion dynamics in phase space
Enforcing stability and conservation constraints for accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variational Bayesian inference for unsupervised Hamiltonian learning
Sparse symplectic Gaussian processes with Fourier features
Stability-constrained ELBO loss with multi-gradient regularization
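The third bullet describes a data-fidelity objective balanced against physics penalties by hyper-parameters. A minimal toy sketch of that pattern (my own illustration with a grid search over one damping parameter; the paper's ELBO, losses, and weights are not reproduced) adds an energy-drift penalty to a fit against noisy samples of a conservative oscillator:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0 * np.pi, 200)
q_obs = np.cos(t) + 0.05 * rng.normal(size=t.size)  # noisy conservative trajectory

def total_loss(gamma, lam=1.0):
    """Data fidelity + lam * conservation penalty for a damped candidate model."""
    q = np.exp(-gamma * t) * np.cos(t)       # candidate solution with damping gamma
    p = np.gradient(q, t)                    # momentum via finite differences
    data_fit = np.mean((q - q_obs) ** 2)     # stands in for the ELBO data term
    energy = 0.5 * (p ** 2 + q ** 2)
    conservation = np.var(energy)            # penalizes energy drift over time
    return data_fit + lam * conservation

gammas = np.linspace(0.0, 0.3, 61)
best_gamma = gammas[np.argmin([total_loss(g) for g in gammas])]
```

Since the observations come from an undamped system, both terms favor γ ≈ 0; the weight `lam` plays the role of the hyper-parameter balance between fidelity and physical consistency.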
Luke McLennan
Oden Institute for Computational Engineering and Sciences, The University of Texas at Austin
Yi Wang
Oden Institute for Computational Engineering and Sciences, The University of Texas at Austin
Ryan Farell
Operations Research and Industrial Engineering, The University of Texas at Austin
Minh Nguyen
Department of Mathematics, The University of Texas at Austin
Chandrajit Bajaj
Computational Applied Mathematics Chair, Professor of Computer Science
Comp. Math., Data Sciences, Image Processing, Geometric Modeling, Structural Bioinformatics and Visualization