🤖 AI Summary
Data-driven turbulence closure models generalize poorly to unstructured and irregular meshes. Method: This paper proposes an end-to-end differentiable, physics-informed framework that embeds a graph neural network (GNN) within an incompressible Navier–Stokes finite-element solver, enabling solver-in-the-loop optimization of subgrid-scale (SGS) closures. Contribution/Results: The gradient-through-solver training pipeline reduces reliance on large annotated datasets while learning closure relations that are physically consistent, faithful across scales, and transferable between 2D and 3D settings. Evaluated on the backward-facing step turbulent flow benchmark, the model accurately reproduces key statistical quantities, including Reynolds stresses, energy spectra, and vortex structures, with low prediction error and numerical stability. Crucially, it demonstrates strong generalization to complex geometries and unstructured meshes.
📝 Abstract
Differentiable physical simulators are proving to be valuable tools for developing data-driven models for computational fluid dynamics (CFD). In particular, these simulators enable end-to-end training of machine learning (ML) models embedded within CFD solvers. This paradigm enables novel algorithms that combine the generalization power and low cost of physics-based simulations with the flexibility and automation of deep learning methods. In this study, we introduce a framework for embedding deep learning models within a finite element solver for the incompressible Navier–Stokes equations, specifically applying this approach to learn a subgrid-scale (SGS) closure with a graph neural network (GNN). We first demonstrate the feasibility of the approach on flow over a two-dimensional backward-facing step, using it as a proof of concept to show that solver-consistent training produces stable and physically meaningful closures. Then, we extend this to turbulent flow over a three-dimensional backward-facing step. In this setting, the GNN-based closure not only attains low prediction errors, but also recovers key turbulence statistics and preserves multiscale turbulent structures. We further demonstrate that the closure can be identified in data-limited learning scenarios as well. Overall, the proposed end-to-end learning paradigm offers a viable pathway toward physically consistent and generalizable data-driven SGS modeling on complex and unstructured domains.
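The core idea, training a learned closure by differentiating through the solver's time-stepping loop, can be illustrated with a minimal sketch. This is not the paper's code: the finite-element solver is replaced by a toy one-dimensional explicit diffusion step, and the GNN closure by a two-parameter linear model; the names `closure`, `step`, and `rollout_loss` are all hypothetical. The point is only that, with an autodiff framework such as JAX, gradients of a rollout loss flow back through every solver step into the closure parameters.

```python
# Illustrative sketch (not the paper's implementation): solver-in-the-loop
# training of a learned subgrid closure on a toy 1-D diffusion problem.
import jax
import jax.numpy as jnp

def closure(params, u):
    # Hypothetical stand-in for the GNN closure: a linear model on gradients.
    return params["w"] * jnp.gradient(u) + params["b"]

def step(params, u, dt=0.01, nu=0.1):
    # One explicit solver step with the learned closure added to the RHS.
    lap = jnp.gradient(jnp.gradient(u))
    return u + dt * (nu * lap + closure(params, u))

def rollout_loss(params, u0, u_ref, n_steps=10):
    # Unroll the solver and compare the final state to reference data;
    # autodiff carries gradients through the whole trajectory.
    u = u0
    for _ in range(n_steps):
        u = step(params, u)
    return jnp.mean((u - u_ref) ** 2)

params = {"w": jnp.array(0.0), "b": jnp.array(0.0)}
u0 = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, 32))
u_ref = 0.9 * u0  # stand-in "reference" data, not from the paper

# End-to-end gradients with respect to the closure parameters.
grads = jax.grad(rollout_loss)(params, u0, u_ref)
```

In the paper's setting the toy `step` would be a differentiable finite-element Navier–Stokes update and `closure` a GNN acting on the unstructured mesh graph, but the training loop has the same shape: a differentiable rollout, a loss on solver output, and gradient descent on the embedded model's parameters.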