GIMLET: Generalizable and Interpretable Model Learning through Embedded Thermodynamics

📅 2025-12-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the gray-box discovery of governing equations for fluid flow and scalar transport: the data-driven inference of unknown constitutive closure terms, such as turbulent stresses and diffusive fluxes, when the time-derivative, advection, and pressure-gradient terms are known, while guaranteeing thermodynamic consistency. The proposed framework is grounded in the variational principle of non-equilibrium thermodynamics: for the first time, free-energy and dissipation functionals are parameterized by differentiable neural networks, and the governing equations are rigorously derived from them via automatic differentiation, so the learned dynamics inherently satisfy the thermodynamic constraints of monotonic free-energy decay and non-negative entropy production. The method requires no predefined function library, offering both strong generalizability and physical interpretability. It is validated on the Burgers equation, the Kuramoto–Sivashinsky equation, and Newtonian and non-Newtonian Navier–Stokes systems, and generalizes successfully across diverse operating conditions.

📝 Abstract
We develop a data-driven framework for discovering constitutive relations in models of fluid flow and scalar transport. Our approach infers unknown closure terms in the governing equations (gray-box discovery) under the assumption that the temporal derivative, convective transport, and pressure-gradient contributions are known. The formulation is rooted in a variational principle from nonequilibrium thermodynamics, where the dynamics is defined by a free-energy functional and a dissipation functional. The unknown constitutive terms arise as functional derivatives of these functionals with respect to the state variables. To enable a flexible and structured model discovery, the free-energy and dissipation functionals are parameterized using neural networks, while their functional derivatives are obtained via automatic differentiation. This construction enforces thermodynamic consistency by design, ensuring monotonic decay of the total free energy and non-negative entropy production. The resulting method, termed GIMLET (Generalizable and Interpretable Model Learning through Embedded Thermodynamics), avoids reliance on a predefined library of candidate functions, unlike sparse regression or symbolic identification approaches. The learned models are generalizable in that functionals identified from one dataset can be transferred to distinct datasets governed by the same underlying equations. Moreover, the inferred free-energy and dissipation functions provide direct physical interpretability of the learned dynamics. The framework is demonstrated on several benchmark systems, including the viscous Burgers equation, the Kuramoto--Sivashinsky equation, and the incompressible Navier--Stokes equations for both Newtonian and non-Newtonian fluids.
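The variational structure described in the abstract, where dynamics follow from functional derivatives of a free-energy functional, can be sketched in a few lines. The toy below is illustrative only: the paper parameterizes the free energy with a neural network and differentiates it automatically, whereas here a hand-chosen Ginzburg–Landau-type functional with its analytic functional derivative stands in, so the guaranteed monotonic free-energy decay of the gradient flow du/dt = -M δF/δu can be checked directly.

```python
import numpy as np

# Gradient-flow sketch: du/dt = -M * dF/du with mobility M > 0 gives
# dF/dt = -M * ||dF/du||^2 <= 0, i.e. monotonic free-energy decay.
# F here is a simple Ginzburg-Landau-type functional (NOT the paper's
# learned neural-network functional) so dF/du is available in closed form.

def free_energy(u, dx, kappa=0.01):
    """Discrete F[u] = sum( 0.25*(u^2-1)^2 + 0.5*kappa*|du/dx|^2 ) dx (periodic)."""
    du = (np.roll(u, -1) - u) / dx                     # forward difference
    return np.sum(0.25 * (u**2 - 1.0)**2 + 0.5 * kappa * du**2) * dx

def functional_derivative(u, dx, kappa=0.01):
    """Analytic dF/du = u^3 - u - kappa * d2u/dx2 (periodic Laplacian)."""
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    return u**3 - u - kappa * lap

def step(u, dx, dt, mobility=1.0):
    """One explicit Euler step of the gradient flow du/dt = -M dF/du."""
    return u - dt * mobility * functional_derivative(u, dx)

# Evolve a random field and record the free energy at every step.
rng = np.random.default_rng(0)
n = 128
dx, dt = 1.0 / n, 1e-4
u = rng.uniform(-0.5, 0.5, n)

energies = [free_energy(u, dx)]
for _ in range(300):
    u = step(u, dx, dt)
    energies.append(free_energy(u, dx))

assert energies[-1] < energies[0]  # free energy has decayed
```

In GIMLET the closed-form `functional_derivative` is replaced by automatic differentiation of a learned functional, but the same structure carries the thermodynamic guarantee.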
Problem

Research questions and friction points this paper is trying to address.

How can unknown closure terms in fluid-flow and scalar-transport models be discovered from data?
How can thermodynamic consistency be guaranteed in models learned with neural networks?
How can learned dynamics remain interpretable and generalizable without a predefined library of candidate functions?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses neural networks to parameterize free-energy and dissipation functionals
Enforces thermodynamic consistency via automatic differentiation of functionals
Learns generalizable models without predefined candidate function libraries
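The second bullet, thermodynamic consistency enforced by construction rather than by loss penalties, rests on a simple idea: if the dissipation rate is expressed as a sum of squares of a learned function, non-negative entropy production holds for any parameter values. The tiny random-weight "network" below is a hypothetical minimal sketch and does not reproduce the paper's architecture.

```python
import numpy as np

# A dissipation potential built as a sum of squares is non-negative for
# *any* parameter values, so the constraint holds by construction.
# The one-hidden-layer "network" with random weights is purely illustrative.

def dissipation(grad_u, W1, b1, w2):
    """Non-negative dissipation: Phi = sum_i ( w2 @ tanh(W1 @ x_i + b1) )^2."""
    h = np.tanh(W1 @ grad_u + b1)    # hidden features, one column per sample
    return np.sum((w2 @ h) ** 2)     # squared output => Phi >= 0 always

rng = np.random.default_rng(1)
W1 = rng.normal(size=(8, 1))         # input -> hidden weights
b1 = rng.normal(size=(8, 1))         # hidden biases
w2 = rng.normal(size=(1, 8))         # hidden -> output weights
grads = rng.normal(size=(1, 64))     # arbitrary gradient samples

phi = dissipation(grads, W1, b1, w2)
assert phi >= 0.0                    # holds for any weight draw
```

Because non-negativity is structural, no constraint violation is possible during training, which is what lets the learned functionals transfer to new datasets governed by the same equations.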
Suguru Shiratori
Department of Mechanical Systems Engineering, Tokyo City University, 1-28-1 Tamazutsumi, Setagaya-ku, 158-8557 Tokyo, Japan
Elham Kiyani
Division of Applied Mathematics, Brown University, 182 George Street, Providence, 02912, RI, USA
Khemraj Shukla
Rice University
Applied Mathematics · PDE · Machine Learning · HPC
George Em Karniadakis
The Charles Pitts Robinson and John Palmer Barstow Professor of Applied Mathematics and Engineering
Math + Machine Learning · Probabilistic Scientific Computing · Stochastic Multiscale Modeling