🤖 AI Summary
How can physically consistent continuous-time dynamics and a posterior distribution over Hamiltonian functions be learned from noisy, irregularly sampled trajectory data while strictly enforcing energy conservation and passivity constraints? This paper proposes the multistep port-Hamiltonian Gaussian process (MS-PHS GP): it places a Gaussian process prior on the Hamiltonian function and encodes physical constraints, derived from a variable-step multistep numerical integrator, as linear functionals. This enables closed-form conditional inference over both the vector field and the Hamiltonian surface without latent variables, guaranteeing energy balance and port-Hamiltonian structure by construction. Leveraging a finite-sample error analysis and discretization-aware constraints, the method achieves improved vector-field recovery accuracy on benchmark systems, including the mass-spring (harmonic), van der Pol, and Duffing oscillators, and provides well-calibrated uncertainty quantification over the Hamiltonian.
📝 Abstract
We propose the multistep port-Hamiltonian Gaussian process (MS-PHS GP) to learn physically consistent continuous-time dynamics and a posterior over the Hamiltonian from noisy, irregularly sampled trajectories. By placing a GP prior on the Hamiltonian surface $H$ and encoding variable-step multistep integrator constraints as finite linear functionals, MS-PHS GP enables closed-form conditioning of both the vector field and the Hamiltonian surface without latent states, while enforcing energy balance and passivity by design. We state a finite-sample vector-field error bound that separates estimation and variable-step discretization terms. Finally, we demonstrate improved vector-field recovery and well-calibrated Hamiltonian uncertainty on mass-spring, van der Pol, and Duffing benchmarks.
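The core mechanism, conditioning a GP through the linear functionals induced by a variable-step multistep integrator, can be illustrated on a 1-D toy problem. The sketch below is not the paper's implementation: it drops the port-Hamiltonian structure and uses a simple scalar system $\dot{x} = f(x)$ with a trapezoidal (two-step, variable-step) rule, so that each observed state increment $x_{n+1} - x_n \approx \tfrac{h_n}{2}\big(f(x_n) + f(x_{n+1})\big)$ is a finite linear functional of the unknown $f$. All names, kernel choices, and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Irregularly sampled trajectory of dx/dt = -x, i.e. x(t) = exp(-t).
t = np.sort(rng.uniform(0.0, 2.0, 60))
x = np.exp(-t)

# Variable-step trapezoidal constraint:
#   x_{n+1} - x_n ≈ (h_n / 2) * (f(x_n) + f(x_{n+1}))
# Each row of A applies one such linear functional to f evaluated at the states.
h = np.diff(t)
N = len(x)
A = np.zeros((N - 1, N))
for n in range(N - 1):
    A[n, n] += h[n] / 2.0
    A[n, n + 1] += h[n] / 2.0
y = np.diff(x)  # observed state increments

# RBF kernel on state space (illustrative hyperparameters).
def k(a, b, ell=0.5, sf=1.0):
    return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

Kxx = k(x, x)
# Gram matrix of the linear functionals, plus a small noise/jitter term.
S = A @ Kxx @ A.T + 1e-6 * np.eye(N - 1)

# Closed-form posterior mean of f under observations y = A f + eps.
xs = np.linspace(0.2, 0.9, 20)  # test states
f_mean = k(xs, x) @ A.T @ np.linalg.solve(S, y)

# f_mean should be close to the true vector field f(x) = -x;
# the residual mixes estimation and discretization error.
print(np.max(np.abs(f_mean - (-xs))))
```

Because the constraints are linear in $f$, no latent states are needed: the same kernel algebra that conditions on direct function values conditions on integrator-weighted combinations of them, which is the discretization-aware idea the abstract describes.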