Time-adaptive SympNets for separable Hamiltonian systems

📅 2025-09-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing symplectic neural networks such as SympNets require uniformly sampled training data, which limits their applicability to real-world Hamiltonian systems observed on non-uniform time grids. Method: We propose Time-adaptive Symplectic Networks (TSympNets), a framework for modeling separable and non-autonomous Hamiltonian systems on arbitrary, non-uniform time meshes. TSympNets integrate time-adaptive mechanisms into the SympNet architecture, unifying numerical ODE solvers with deep learning while rigorously preserving the symplectic structure. Contribution/Results: Theoretically, we establish a universal approximation theorem for TSympNets on separable Hamiltonian systems, rectify a flaw in a prior proof on the approximation of symplectic maps, and prove that the approximation result cannot be extended to the non-separable case. Experimentally, TSympNets achieve higher accuracy and robustness than state-of-the-art baselines across diverse non-uniform sampling regimes.

📝 Abstract
Measurement data is often sampled irregularly, i.e., not on equidistant time grids. This is also true for Hamiltonian systems. However, existing machine learning methods that learn symplectic integrators, such as SympNets [20] and HénonNets [4], still require training data generated with fixed step sizes. To learn time-adaptive symplectic integrators, an extension to SympNets, which we call TSympNets, was introduced in [20]. We adapt the architecture of TSympNets and extend it to non-autonomous Hamiltonian systems. So far, the approximation qualities of TSympNets were unknown. We close this gap by providing a universal approximation theorem for separable Hamiltonian systems and show that it cannot be extended to non-separable Hamiltonian systems. To investigate these theoretical approximation capabilities, we perform several numerical experiments. Furthermore, we fix a mistake in the proof of a substantial theorem [25, Theorem 2] for the approximation of symplectic maps in general, and for symplectic machine learning methods in particular.
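The setting the abstract describes is easy to make concrete: a separable Hamiltonian H(q, p) = T(p) + V(q) admits explicit symplectic updates whose step size h can differ at every step, which is exactly the irregular-grid regime TSympNets target. A minimal numerical sketch (this is a classical Störmer–Verlet integrator, not the paper's learned architecture; the pendulum Hamiltonian and the random step-size sequence are illustrative assumptions):

```python
import numpy as np

def leapfrog_step(q, p, h, dT, dV):
    """One symplectic Stormer-Verlet step for a separable H(q,p) = T(p) + V(q).

    The step size h is a per-step argument, so the same map works on
    non-equidistant time grids.
    """
    p = p - 0.5 * h * dV(q)   # half "kick" from the potential
    q = q + h * dT(p)         # full "drift" from the kinetic part
    p = p - 0.5 * h * dV(q)   # second half kick
    return q, p

# Illustrative system: pendulum with T(p) = p^2/2, V(q) = 1 - cos(q)
dT = lambda p: p
dV = lambda q: np.sin(q)
H = lambda q, p: 0.5 * p**2 + (1.0 - np.cos(q))

rng = np.random.default_rng(0)
steps = rng.uniform(0.01, 0.1, size=500)  # irregular step sizes

q, p = 1.0, 0.0
E0 = H(q, p)
for h in steps:
    q, p = leapfrog_step(q, p, h, dT, dV)

# Because each step is symplectic, the energy error stays bounded
# even though the time grid is non-uniform.
energy_error = abs(H(q, p) - E0)
print(energy_error)
```

A learned time-adaptive integrator replaces the fixed update maps with trained symplectic layers while keeping h as an explicit input, which is the structural idea the abstract attributes to TSympNets.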
Problem

Research questions and friction points this paper is trying to address.

Learning time-adaptive symplectic integrators for irregular data
Extending TSympNets to non-autonomous Hamiltonian systems
Providing approximation theory for separable Hamiltonian systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Time-adaptive SympNets architecture extension
Universal approximation theorem for separable systems
Numerical experiments validate theoretical capabilities