🤖 AI Summary
This work addresses the limitation of conventional Hamiltonian Neural Networks (HNNs) in modeling multi-timescale dynamical systems due to spectral bias. The authors propose Frequency-Separable Hamiltonian Neural Networks (FS-HNN), which explicitly decompose the Hamiltonian into fast and slow dynamical components, each modeled by dedicated subnetworks. These subnetworks are jointly trained on multiscale sampled data, enabling reconstruction of the full Hamiltonian structure without requiring strong domain-specific assumptions. The framework is further extended to partial differential equations by learning state- and boundary-condition-dependent symplectic operators. Evaluated across diverse ordinary and partial differential equation systems, FS-HNN demonstrates significantly improved long-term extrapolation accuracy and superior generalization capability across multiple timescales.
📝 Abstract
While Hamiltonian mechanics provides a powerful inductive bias for neural networks modeling dynamical systems, Hamiltonian Neural Networks and their variants often fail to capture complex temporal dynamics spanning multiple timescales. This limitation is commonly linked to the spectral bias of deep neural networks, which favors learning low-frequency, slow-varying dynamics. Prior approaches have sought to address this issue through symplectic integration schemes that enforce energy conservation or by incorporating geometric constraints to impose structure on the configuration space. However, such methods either remain limited in their ability to fully capture multiscale dynamics or require substantial domain-specific assumptions. In this work, we exploit the observation that Hamiltonian functions admit decompositions into explicit fast and slow modes and can be reconstructed from these components. We introduce the Frequency-Separable Hamiltonian Neural Network (FS-HNN), which parameterizes the system Hamiltonian using multiple networks, each governed by Hamiltonian dynamics and trained on data sampled at distinct timescales. We further extend this framework to partial differential equations by learning state- and boundary-conditioned symplectic operators. Empirically, we show that FS-HNN improves long-horizon extrapolation performance on challenging dynamical systems and generalizes across a broad range of ODE and PDE problems.
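The core idea of the abstract can be sketched in a few lines: parameterize the Hamiltonian as a sum of a slow and a fast component, each its own network, and evolve the state with Hamilton's equations (dq/dt = ∂H/∂p, dp/dt = −∂H/∂q). The sketch below is a minimal illustration under assumed details, not the authors' implementation: the random-feature "networks," the `scale` knob controlling frequency content, and the finite-difference gradients are all stand-ins for trained subnetworks with autodiff.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_component(width=16, scale=1.0):
    """Tiny one-hidden-layer map (q, p) -> scalar Hamiltonian component.
    Larger `scale` yields higher-frequency features, a stand-in for the
    'fast' subnetwork in the frequency-separable decomposition."""
    W = rng.normal(size=(width, 2)) * scale
    b = rng.normal(size=width)
    v = rng.normal(size=width) / width
    def H(q, p):
        return float(v @ np.tanh(W @ np.array([q, p]) + b))
    return H

# Decompose H into explicit slow and fast modes, as the paper proposes;
# in FS-HNN each would be trained on data sampled at its own timescale.
H_slow = make_component(scale=0.5)
H_fast = make_component(scale=5.0)

def H_total(q, p):
    return H_slow(q, p) + H_fast(q, p)

def hamiltonian_field(H, q, p, eps=1e-5):
    """Hamilton's equations via central finite differences:
    dq/dt = dH/dp, dp/dt = -dH/dq."""
    dHdq = (H(q + eps, p) - H(q - eps, p)) / (2 * eps)
    dHdp = (H(q, p + eps) - H(q, p - eps)) / (2 * eps)
    return dHdp, -dHdq

# Roll the reconstructed system forward a short horizon (explicit Euler,
# for illustration only; a symplectic integrator would conserve H better).
q, p = 1.0, 0.0
E0 = H_total(q, p)
dt = 1e-3
for _ in range(100):
    dq, dp = hamiltonian_field(H_total, q, p)
    q, p = q + dt * dq, p + dt * dp
drift = abs(H_total(q, p) - E0)
```

Because trajectories follow the symplectic gradient of the summed Hamiltonian, the energy `H_total` stays nearly constant over the short rollout, which is the structural property HNN-style models inherit by construction.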