Thermodynamically Consistent Latent Dynamics Identification for Parametric Systems

📅 2025-06-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Model reduction for parametrized nonlinear dynamical systems often sacrifices thermodynamic consistency for computational efficiency. Method: The paper proposes a thermodynamically consistent framework built on parametric GENERIC formalism-informed neural networks (pGFINNs), integrated with autoencoders for state-space compression and augmented by a greedy, residual-based, physics-informed active learning strategy, ensuring free energy conservation and nonnegative entropy production across the parameter space. Contribution/Results: The framework achieves both physical interpretability and high data efficiency. Evaluated on the Burgers' equation and the 1D/1V Vlasov-Poisson equation, it delivers up to 3,528x speedup with only 1-3% relative error, reduces training costs by 50-90%, and cuts inference costs by 57-61%. It significantly enhances cross-parameter generalization and reveals the underlying thermodynamic behavior of the system.

📝 Abstract
We propose an efficient thermodynamics-informed latent space dynamics identification (tLaSDI) framework for the reduced-order modeling of parametric nonlinear dynamical systems. This framework integrates autoencoders for dimensionality reduction with newly developed parametric GENERIC formalism-informed neural networks (pGFINNs), which enable efficient learning of parametric latent dynamics while preserving key thermodynamic principles such as free energy conservation and entropy generation across the parameter space. To further enhance model performance, a physics-informed active learning strategy is incorporated, leveraging a greedy, residual-based error indicator to adaptively sample informative training data, outperforming uniform sampling at equivalent computational cost. Numerical experiments on the Burgers' equation and the 1D/1V Vlasov-Poisson equation demonstrate that the proposed method achieves up to 3,528x speed-up with 1-3% relative errors, and significant reduction in training (50-90%) and inference (57-61%) cost. Moreover, the learned latent space dynamics reveal the underlying thermodynamic behavior of the system, offering valuable insights into the physical-space dynamics.
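The thermodynamic guarantees the abstract mentions come from the GENERIC structure that pGFINNs impose on the latent dynamics, dz/dt = L(z) dE/dz + M(z) dS/dz, where L is skew-symmetric, M is symmetric positive semidefinite, and the degeneracy conditions L dS/dz = 0 and M dE/dz = 0 hold. A minimal numerical sketch (not the authors' code; the gradients and matrices below are random toy stand-ins for what trained networks would produce) shows how building L and M this way yields energy conservation and nonnegative entropy production by construction:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                   # latent dimension (illustrative)

# Toy energy/entropy gradients at a latent state z; in pGFINNs these
# would come from learned energy and entropy networks.
gE = rng.standard_normal(d)
gS = rng.standard_normal(d)

def proj_orth(v):
    """Projector onto the subspace orthogonal to v."""
    return np.eye(len(v)) - np.outer(v, v) / (v @ v)

# Build the structure matrices so the GENERIC degeneracy conditions hold
# by construction: sandwiching between projectors kills L @ gS and M @ gE
# while preserving skew-symmetry of L and symmetric PSD-ness of M.
A = rng.standard_normal((d, d))
L = proj_orth(gS) @ (A - A.T) @ proj_orth(gS)   # skew-symmetric, L @ gS = 0
B = rng.standard_normal((d, d))
M = proj_orth(gE) @ (B @ B.T) @ proj_orth(gE)   # sym. PSD, M @ gE = 0

dzdt = L @ gE + M @ gS                  # latent GENERIC dynamics

dEdt = gE @ dzdt                        # = 0: free energy is conserved
dSdt = gS @ dzdt                        # >= 0: entropy production is nonnegative
print(dEdt, dSdt)
```

The two checks follow from the structure alone: gE' L gE vanishes by skew-symmetry and gE' M gS = (M gE)' gS = 0, while gS' M gS >= 0 because M is positive semidefinite, regardless of the state or parameter value.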
Problem

Research questions and friction points this paper is trying to address.

Identify parametric latent dynamics that preserve thermodynamic principles
Enhance model performance with physics-informed active learning
Achieve computational efficiency in reduced-order modeling of dynamical systems
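The greedy, residual-based sampling the paper uses can be sketched as a simple loop (a minimal illustration, not the paper's implementation; `residual_indicator` is a hypothetical stand-in for the physics-informed error indicator, here just the distance to the nearest sampled parameter):

```python
import numpy as np

# Hypothetical residual indicator: in the paper this would measure how badly
# the learned latent model satisfies the governing equations at parameter mu;
# here the error is proxied by distance to the nearest training parameter.
def residual_indicator(mu, train_set):
    return min(abs(mu - t) for t in train_set)

candidates = np.linspace(0.0, 1.0, 21)   # candidate parameter grid
train_set = [0.0, 1.0]                   # start from the boundary parameters

for _ in range(3):                       # greedy loop: add worst-case parameter
    errs = [residual_indicator(mu, train_set) for mu in candidates]
    mu_star = candidates[int(np.argmax(errs))]
    train_set.append(mu_star)
    # ... retrain the reduced-order model on the enlarged training set ...

print(sorted(train_set))
```

Each iteration spends the training budget where the indicator flags the largest error, which is why the abstract reports that this outperforms uniform sampling at equivalent cost.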
Innovation

Methods, ideas, or system contributions that make the work stand out.

Autoencoders reduce dimensions for dynamics modeling
pGFINNs preserve thermodynamic principles in learning
Active learning adaptively samples informative training data
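The compression step in the first bullet can be illustrated with a linear stand-in (the paper uses nonlinear autoencoders; the snapshot data and latent dimension below are made up): encode full-order snapshots into a low-dimensional latent space, identify dynamics there, and decode predictions back to physical space.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical snapshot matrix: 200 full-order states of dimension 100,
# constructed to be low-rank so a small latent space suffices.
X = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 100))

k = 10                                   # latent dimension (illustrative)
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)

def encode(x):                           # full state -> latent coordinates
    return (x - mean) @ Vt[:k].T

def decode(z):                           # latent coordinates -> full state
    return z @ Vt[:k] + mean

Z = encode(X)                            # latent snapshots, shape (200, 10)
rel_err = np.linalg.norm(X - decode(Z)) / np.linalg.norm(X)
print(Z.shape, rel_err)
```

An SVD-based linear encoder is used here only to make the round trip concrete; the point of the nonlinear autoencoders in tLaSDI is to reach comparable latent dimensions when the solution manifold is not well approximated by a linear subspace.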