AI Summary
This work addresses the persistent challenge of achieving long-term, high-fidelity simulations of complex fluids near solid boundaries, where conventional methods are computationally prohibitive and purely data-driven models suffer from error accumulation and poor extrapolation robustness. The authors propose a physics-embedded, end-to-end differentiable framework that tightly integrates the pressure-projection scheme of the incompressible Navier–Stokes equations, a multi-direct forcing immersed boundary method, and neural differential modeling. A ConvResNet architecture replaces the costly pressure solver, while a sub-iteration strategy decouples physical stability from the time-step size, enabling efficient training with only single-step supervision. Evaluated on flow past a cylinder at Reynolds number 100, the model surpasses existing baselines in both fidelity and long-term stability, achieves approximately 200× faster inference than high-resolution solvers, and requires less than one hour of training on a single GPU.
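To make the pressure-projection step concrete, here is a minimal toy sketch (not the authors' code) of Chorin-style projection on a small periodic grid, using a consistent FFT-based Poisson solve. In the paper, this pressure solve is the expensive step that a ConvResNet learns to replace; the grid setup, function names, and periodic boundary conditions below are illustrative assumptions.

```python
import numpy as np

def divergence(u, v, h):
    # Central-difference divergence on a periodic grid with spacing h.
    dudx = (np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) / (2 * h)
    dvdy = (np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0)) / (2 * h)
    return dudx + dvdy

def project(u_star, v_star, h, dt):
    # Projection step: solve lap(p) = div(u*) / dt, then subtract dt * grad(p)
    # so the corrected velocity is discretely divergence-free. The paper
    # replaces this Poisson solve with a learned ConvResNet correction.
    n = u_star.shape[0]
    rhs = divergence(u_star, v_star, h) / dt
    k = 2 * np.pi * np.fft.fftfreq(n, d=h)
    kx, ky = np.meshgrid(k, k)
    # Fourier symbol of the central-difference Laplacian (div of grad),
    # so the discrete operators are mutually consistent.
    lap = -((np.sin(kx * h) / h) ** 2 + (np.sin(ky * h) / h) ** 2)
    lap[0, 0] = 1.0              # avoid division by zero at the mean mode
    p_hat = np.fft.fft2(rhs) / lap
    p_hat[0, 0] = 0.0            # pressure is defined only up to a constant
    p = np.real(np.fft.ifft2(p_hat))
    dpdx = (np.roll(p, -1, axis=1) - np.roll(p, 1, axis=1)) / (2 * h)
    dpdy = (np.roll(p, -1, axis=0) - np.roll(p, 1, axis=0)) / (2 * h)
    return u_star - dt * dpdx, v_star - dt * dpdy
```

After projection, the discrete divergence of the velocity field drops to round-off level, which is the invariant the framework's learned correction is trained to preserve.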
Abstract
Accurately, efficiently, and stably computing complex fluid flows and their evolution near solid boundaries over long horizons remains challenging. Conventional numerical solvers require fine grids and small time steps to resolve near-wall dynamics, resulting in high computational costs, while purely data-driven surrogate models accumulate rollout errors and lack robustness under extrapolative conditions. To address these issues, this study extends existing neural PDE solvers by developing a physics-integrated differentiable framework for long-horizon prediction of immersed-boundary flows. The framework's key design is the structural integration of physical principles into an end-to-end differentiable architecture comprising a PDE-based intermediate-velocity module and a multi-direct forcing immersed boundary module, both adhering to the pressure-projection procedure for incompressible flow computation. The computationally expensive pressure-projection step is substituted with a learned implicit correction using ConvResNet blocks to reduce cost, and a sub-iteration strategy is introduced to decouple the embedded physics module's stability requirement from the surrogate model's time step, enabling stable coarse-grid autoregressive rollouts with large effective time increments. The framework uses only single-step supervision for training, eliminating long-horizon backpropagation and reducing training time to under one hour on a single GPU. Evaluations on benchmark cases of flow past a stationary cylinder and a rotationally oscillating cylinder at Re=100 show the proposed model consistently outperforms purely data-driven, physics-loss-constrained, and coarse-grid numerical baselines in flow-field fidelity and long-horizon stability, while achieving an approximately 200-fold inference speedup over the high-resolution solver.
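The sub-iteration idea can be illustrated on a toy problem rather than the paper's solver: an explicit diffusion update is stable only below a CFL-like limit on the time step, so a large surrogate step is realized internally as several small physics sub-steps. The 1D diffusion stand-in, step sizes, and function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def diffusion_substep(u, nu, dt, h):
    # Explicit (FTCS) diffusion update on a 1D periodic grid;
    # stable only when dt <= h**2 / (2 * nu).
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / h ** 2
    return u + nu * dt * lap

def rollout(u0, n_steps, dt_big, n_sub, nu, h):
    # Autoregressive rollout with sub-iteration: each coarse step of size
    # dt_big is realized as n_sub internal physics sub-steps of dt_big / n_sub,
    # decoupling the physics stability limit from the surrogate time step.
    u = u0.copy()
    dt = dt_big / n_sub
    for _ in range(n_steps):
        for _ in range(n_sub):
            u = diffusion_substep(u, nu, dt, h)
    return u
```

With `n_sub = 1` the coarse step exceeds the explicit stability limit and the rollout blows up, while the same coarse step split into sub-iterations stays stable, mirroring how the framework keeps large effective time increments without violating the embedded physics module's stability constraint.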