🤖 AI Summary
This paper proposes an unsupervised learning framework that unifies the modeling of conservative, dissipative, and port-Hamiltonian systems from highly noisy, sparsely sampled phase-space trajectory data. Methodologically, it introduces a sparse symplectic Gaussian process model grounded in variational Bayesian inference, incorporating symplectic geometric constraints, stability-inducing priors, and energy-conservation regularization; random Fourier features accelerate kernel computation, and the evidence lower bound (ELBO) is optimized jointly with multi-gradient physics-informed consistency losses. The key contribution is the first integration of variational inference with explicit symplectic structure embedding in a single framework, enabling simultaneous dynamical system identification, uncertainty quantification, and long-term stable prediction. Experiments demonstrate high-fidelity reconstruction of system evolution under strong noise, bounded prediction error, and physically consistent long-horizon behavior.
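The three system classes the summary refers to admit one generalized form, dx/dt = (J − R)∇H(x) + G·u(t): setting R = 0 and u = 0 recovers a conservative Hamiltonian system, R ⪰ 0 adds dissipation, and the G·u term gives the port-Hamiltonian case. A minimal sketch of this decomposition follows; the specific matrices, quadratic Hamiltonian, step size, and explicit-Euler integrator are illustrative assumptions, not the paper's learned model:

```python
import numpy as np

def generalized_hamiltonian_flow(grad_H, J, R, G=None, u=None):
    """Vector field dx/dt = (J - R) @ grad_H(x) + G @ u(t).

    R = 0 and u = None -> conservative; R positive semidefinite ->
    dissipative; nonzero G, u -> port-Hamiltonian with external input.
    """
    def f(t, x):
        dx = (J - R) @ grad_H(x)
        if G is not None and u is not None:
            dx = dx + G @ u(t)
        return dx
    return f

# Damped harmonic oscillator: H(q, p) = (q^2 + p^2) / 2, so grad_H(x) = x.
grad_H = lambda x: x
J = np.array([[0.0, 1.0], [-1.0, 0.0]])   # canonical symplectic structure
R = np.array([[0.0, 0.0], [0.0, 0.1]])    # dissipation acting on momentum

f = generalized_hamiltonian_flow(grad_H, J, R)
x = np.array([1.0, 0.0])
dt = 1e-3
E0 = 0.5 * x @ x
for k in range(10_000):                    # explicit Euler, illustration only
    x = x + dt * f(k * dt, x)
E1 = 0.5 * x @ x                           # energy decays under dissipation
```

With R = 0 the same flow conserves H up to integrator error, which is why the paper pairs this structure with symplectic constraints rather than a generic regressor.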
📝 Abstract
We introduce a robust, unsupervised framework, based on variational Bayesian inference, for learning generalized Hamiltonian dynamics from noisy, sparse phase-space data. Although conservative, dissipative, and port-Hamiltonian systems may share the same initial total energy in a closed system, a single Hamiltonian network model struggles to capture their distinct and varying motion dynamics and physics from sampled observational phase-space trajectories. To address this Hamiltonian manifold learning challenge, we extend sparse symplectic, random Fourier Gaussian process learning with successive predictive numerical estimation of the Hamiltonian landscape, using a generalized state and conjugate-momentum formulation of Hamiltonian dynamics that covers the conservative, dissipative, and port-Hamiltonian classes of physical systems. In addition to the kernelized evidence lower bound (ELBO) loss for data fidelity, we incorporate stability and conservation constraints as hyper-parameter-balanced loss terms that regularize the model's multi-gradients, enforcing physical correctness for improved prediction accuracy with bounded uncertainty.
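The random Fourier feature approximation underlying the sparse Gaussian process can be illustrated in a few lines: draw frequencies from the kernel's spectral density so that an inner product of cosine features approximates the exact kernel. The sketch below uses a plain RBF kernel and hand-picked feature count and lengthscale as stand-in assumptions; the paper's symplectic kernel and variational treatment are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features=2000, lengthscale=1.0, rng=rng):
    """Random Fourier features z(x) such that z(x) @ z(y) approximates
    the RBF kernel k(x, y) = exp(-||x - y||^2 / (2 * lengthscale^2))."""
    d = X.shape[1]
    # Spectral density of the RBF kernel is Gaussian with std 1/lengthscale.
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(5, 3))
Z = rff_features(X)
K_approx = Z @ Z.T                                        # O(n d D) features
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq / 2.0)                               # exact RBF Gram
err = np.abs(K_approx - K_exact).max()                    # small for large D
```

Because the Gram matrix is replaced by a low-rank feature product, the ELBO and its gradients can be evaluated without the cubic cost of exact GP inference, which is what makes the joint optimization with the physics-consistency losses tractable.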