🤖 AI Summary
Existing hybrid neural-physics models suffer from numerical instability and error accumulation in long-horizon spatiotemporal dynamics forecasting. To address this, we propose Im-PiNDiff, an implicit neural differential model that introduces the first fixed-point-based implicit layer for physics-integrated differentiable solving—eliminating stability limitations inherent in explicit recursive schemes. Our method employs a hybrid gradient propagation strategy that decouples memory cost from iteration count, augmented with checkpointing to enable efficient long-horizon rollout training. The approach combines deep equilibrium modeling, the adjoint method, and reverse-mode automatic differentiation within a PDE-constrained, physics-informed learning framework. Evaluated on the advection-diffusion equation, Burgers' equation, and a multi-physics chemical vapor infiltration task, Im-PiNDiff achieves substantial improvements in prediction accuracy, reduces training memory consumption by over 60%, cuts inference time by 45%, and significantly enhances long-term simulation stability.
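The key mechanism — advancing the state through a fixed-point layer and differentiating it with the adjoint / implicit-function-theorem trick rather than backpropagating through every solver iteration — can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the toy update `z ↦ tanh(W z + x)` and all names here are assumptions made for the example.

```python
import numpy as np

def fixed_point_forward(f, z0, tol=1e-12, max_iter=1000):
    """Iterate z <- f(z) to convergence; only the current state is kept,
    so forward memory does not grow with the number of iterations."""
    z = z0
    for _ in range(max_iter):
        z_next = f(z)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

# Toy implicit "layer": z* = tanh(W z* + x)  (contractive because ||W|| is small)
rng = np.random.default_rng(0)
n = 4
W = 0.1 * rng.standard_normal((n, n))
x = rng.standard_normal(n)

z_star = fixed_point_forward(lambda z: np.tanh(W @ z + x), np.zeros(n))

# Adjoint backward pass via the implicit function theorem: with
# f(z, x) = tanh(W z + x) and loss L = sum(z*),
#   dL/dx = (dL/dz*) (I - df/dz)^{-1} df/dx, evaluated at the fixed point.
s = 1.0 - np.tanh(W @ z_star + x) ** 2      # elementwise tanh'
J = s[:, None] * W                          # df/dz at the fixed point
dL_dz = np.ones(n)                          # dL/dz* for L = sum(z*)
# One adjoint linear solve replaces backprop through every forward iteration:
a = np.linalg.solve((np.eye(n) - J).T, dL_dz)
dL_dx = a * s                               # chain through df/dx = diag(s)
```

Because the backward pass is a single linear solve at the converged state, its memory cost is independent of how many forward iterations were needed — the property the summary describes as decoupling memory from iteration count.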
📝 Abstract
Hybrid neural-physics modeling frameworks built on differentiable programming have emerged as powerful tools in scientific machine learning, enabling the integration of known physics with data-driven learning to improve prediction accuracy and generalizability. However, most existing hybrid frameworks rely on explicit recurrent formulations, which suffer from numerical instability and error accumulation during long-horizon forecasting. In this work, we introduce Im-PiNDiff, a novel implicit physics-integrated neural differentiable solver for stable and accurate modeling of spatiotemporal dynamics. Inspired by deep equilibrium models, Im-PiNDiff advances the state using implicit fixed-point layers, enabling robust long-term simulation while remaining fully end-to-end differentiable. To enable scalable training, we develop a hybrid gradient propagation strategy that integrates adjoint-state methods with reverse-mode automatic differentiation. This approach eliminates the need to store intermediate solver states and decouples memory complexity from the number of solver iterations, significantly reducing training overhead. We further incorporate checkpointing techniques to manage memory in long-horizon rollouts. Numerical experiments on various spatiotemporal PDE systems, including advection-diffusion processes, Burgers' dynamics, and multi-physics chemical vapor infiltration processes, demonstrate that Im-PiNDiff achieves superior predictive performance, enhanced numerical stability, and substantial reductions in memory and runtime cost relative to explicit and naive implicit baselines. This work provides a principled, efficient, and scalable framework for hybrid neural-physics modeling.
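The checkpointing idea for long-horizon rollouts — store the state only every K steps on the forward pass, then recompute the in-between states segment by segment during the reverse sweep — can be illustrated with a toy recursion. This is a hedged sketch, not the paper's method: the step `z_{t+1} = tanh(z_t + b)`, the names `step` and `rollout_grad`, and all parameters are made up for the example.

```python
import numpy as np

def step(z, b):
    # hypothetical one-step state update; stands in for a PDE solver step
    return np.tanh(z + b)

def rollout_grad(z0, b, T, K):
    """Loss L = sum(z_T) after T steps; dL/db computed with checkpointing:
    store every K-th state, recompute the rest on the way back."""
    # forward: O(T/K) stored states instead of O(T)
    ckpts = {0: z0.copy()}
    z = z0.copy()
    for t in range(1, T + 1):
        z = step(z, b)
        if t % K == 0:
            ckpts[t] = z.copy()
    loss = z.sum()

    # backward: process segments newest-first, replaying each from its checkpoint
    a = np.ones_like(z)                       # adjoint dL/dz_T for L = sum(z_T)
    dL_db = np.zeros_like(b)
    end = T
    while end > 0:
        start = (end - 1) // K * K            # nearest checkpoint before `end`
        zs = [ckpts[start]]
        for _ in range(start, end):           # recompute states in this segment
            zs.append(step(zs[-1], b))
        for t in range(end - 1, start - 1, -1):
            s = 1.0 - zs[t + 1 - start] ** 2  # tanh' at the step t -> t+1
            dL_db += a * s                    # d z_{t+1} / d b     = diag(s)
            a = a * s                         # d z_{t+1} / d z_t   = diag(s)
        end = start
    return loss, dL_db

z0 = np.array([0.1, -0.2, 0.3])
b = np.array([0.05, 0.1, -0.15])
loss, grad_b = rollout_grad(z0, b, T=12, K=4)
```

With T steps and checkpoint interval K, peak storage is O(T/K + K) states instead of O(T), at the cost of one extra forward recomputation per segment — the standard trade behind memory-efficient long-horizon rollout training.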