WGFINNs: Weak formulation-based GENERIC formalism informed neural networks

📅 2026-04-02
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Accurately identifying governing equations from noisy observational data remains challenging, as existing methods relying on strong-form losses are highly sensitive to noise. This work proposes a physics-informed neural network that integrates the weak formulation of dynamical systems with a GENERIC structure-preserving architecture, replacing the conventional strong-form loss with a weak-form counterpart for the first time. The GENERIC architecture guarantees exact satisfaction of the degeneracy conditions and symmetry constraints, while the weak-form loss provides robustness in noisy environments. The approach further incorporates a state-weighted loss and a residual attention mechanism to enhance equation-discovery accuracy. Theoretical analysis demonstrates that the proposed weak-form estimator is robust to noise under appropriate test functions, and experiments show it significantly outperforms GFINNs across various noise levels, yielding more accurate dynamical predictions and physical-quantity reconstructions.
πŸ“ Abstract
Data-driven discovery of governing equations from noisy observations remains a fundamental challenge in scientific machine learning. While GENERIC formalism informed neural networks (GFINNs) provide a principled framework that enforces the laws of thermodynamics by construction, their reliance on strong-form loss formulations makes them highly sensitive to measurement noise. To address this limitation, we propose weak formulation-based GENERIC formalism informed neural networks (WGFINNs), which integrate the weak formulation of dynamical systems with the structure-preserving architecture of GFINNs. WGFINNs significantly enhance robustness to noisy data while retaining exact satisfaction of GENERIC degeneracy and symmetry conditions. We further incorporate a state-wise weighted loss and a residual-based attention mechanism to mitigate scale imbalance across state variables. Theoretical analysis quantifies the gap between the strong-form and weak-form estimators: the strong-form estimator diverges as the time step decreases in the presence of noise, whereas the weak-form estimator remains accurate on noisy data when the test functions satisfy certain conditions. Numerical experiments demonstrate that WGFINNs consistently outperform GFINNs at varying noise levels, achieving more accurate predictions and reliable recovery of physical quantities.
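As an illustrative aside (not from the paper's code), the noise-robustness argument in the abstract can be sketched numerically on a toy ODE: a strong-form residual differentiates noisy data, so its finite-difference error grows like O(σ/Δt) as the time step shrinks, while a weak-form residual moves the derivative onto a smooth, compactly supported test function via integration by parts and only averages the noise. The system, noise level, and test function below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: dx/dt = f(x) = -x on [0, 1], observed with additive noise.
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
x_clean = np.exp(-t)
x = x_clean + 0.01 * rng.standard_normal(t.size)  # noisy observations

f = lambda x: -x  # true right-hand side

# Strong-form residual: difference quotient of the noisy data vs f(x).
# The O(sigma) observation noise is amplified to O(sigma / dt).
dxdt_fd = (x[1:] - x[:-1]) / dt
strong_res = np.mean((dxdt_fd - f(x[:-1])) ** 2)

# Weak-form residual: test against a smooth bump phi with phi(0) = phi(1) = 0,
# so integration by parts gives  ∫ phi' x dt + ∫ phi f(x) dt ≈ 0
# without ever differentiating the noisy data.
phi = np.sin(np.pi * t) ** 2                      # vanishes at both endpoints
dphi = np.pi * np.sin(2 * np.pi * t)              # phi'
weak_res = ((dphi * x).sum() * dt + (phi * f(x)).sum() * dt) ** 2

print(f"strong-form residual: {strong_res:.3e}")
print(f"weak-form residual:   {weak_res:.3e}")
```

On this example the strong-form residual is dominated by amplified noise, while the weak-form residual stays orders of magnitude smaller; shrinking `dt` further widens the gap, mirroring the divergence result stated in the abstract.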
Problem

Research questions and friction points this paper is trying to address.

scientific machine learning
governing equation discovery
measurement noise
GENERIC formalism
noisy observations
Innovation

Methods, ideas, or system contributions that make the work stand out.

weak formulation
GENERIC formalism
scientific machine learning
noise robustness
structure-preserving neural networks