Stable Port-Hamiltonian Neural Networks

๐Ÿ“… 2025-02-04
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Purely data-driven neural networks for modeling nonlinear dynamical systems suffer from physical implausibility, poor extrapolation capability, and numerical instability. Method: We propose the first neural architecture deeply integrating port-Hamiltonian (pH) structure, explicitly embedding pH energy flow and structural constraints into the network design. This ensures global Lyapunov stability of the learned dynamics and encodes energy conservation/dissipation priors via physics-constrained parameterization, stability-aware regularization, and differentiable modeling. Contribution/Results: The method significantly improves generalization and numerical robustness under sparse-data regimes. Experiments demonstrate consistent superiority over purely data-driven baselines across multi-physics surrogate modeling tasksโ€”achieving higher accuracy, enhanced long-term stability, and physically consistent predictions. Our approach establishes a new paradigm for physics-guided learning in safety-critical applications.

Technology Category

Application Category

๐Ÿ“ Abstract
In recent years, nonlinear dynamic system identification using artificial neural networks has garnered attention due to its manifold potential applications in virtually all branches of science and engineering. However, purely data-driven approaches often struggle with extrapolation and may yield physically implausible forecasts. Furthermore, the learned dynamics can exhibit instabilities, making it difficult to apply such models safely and robustly. This article proposes stable port-Hamiltonian neural networks, a machine learning architecture that incorporates the physical biases of energy conservation or dissipation while guaranteeing global Lyapunov stability of the learned dynamics. Evaluations with illustrative examples and real-world measurement data demonstrate the model's ability to generalize from sparse data, outperforming purely data-driven approaches and avoiding instability issues. In addition, the model's potential for data-driven surrogate modeling is highlighted in application to multi-physics simulation data.
Problem

Research questions and friction points this paper is trying to address.

Addresses instability in neural network dynamics
Ensures global Lyapunov stability in learned models
Improves generalization from sparse data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporates energy conservation biases
Guarantees global Lyapunov stability
Generalizes from sparse data
๐Ÿ”Ž Similar Papers
No similar papers found.