A domain decomposition-based autoregressive deep learning model for unsteady and nonlinear partial differential equations

📅 2024-08-26
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of efficient, large-scale modeling of nonlinear and unsteady partial differential equations (PDEs), this paper proposes transient-CoMLSim, a novel deep learning framework. Methodologically, it couples domain decomposition with latent-space autoregression: the spatial domain is partitioned into subdomains, and a CNN-based autoencoder computes a low-dimensional basis for the solution and condition fields on each subdomain; temporal evolution is then performed autoregressively entirely in the latent space, with rollouts stabilized via curriculum learning. The domain-decomposition strategy lets the model generalize to out-of-distribution computational domain sizes. Experiments across diverse unsteady PDE benchmarks demonstrate that transient-CoMLSim surpasses both the Fourier Neural Operator (FNO) and U-Net in prediction accuracy, extrapolation to unseen timesteps, and numerical stability, while substantially reducing computational complexity, thereby easing the scalability bottleneck of existing deep learning-based physics simulators.

📝 Abstract
In this paper, we propose a domain-decomposition-based deep learning (DL) framework, named transient-CoMLSim, for accurately modeling unsteady and nonlinear partial differential equations (PDEs). The framework consists of two key components: (a) a convolutional neural network (CNN)-based autoencoder architecture and (b) an autoregressive model composed of fully connected layers. Unlike existing state-of-the-art methods that operate on the entire computational domain, our CNN-based autoencoder computes a lower-dimensional basis for solution and condition fields represented on subdomains. Timestepping is performed entirely in the latent space, generating embeddings of the solution variables from the time history of embeddings of solution and condition variables. This approach not only reduces computational complexity but also enhances scalability, making it well-suited for large-scale simulations. Furthermore, to improve the stability of our rollouts, we employ a curriculum learning (CL) approach during the training of the autoregressive model. The domain-decomposition strategy enables scaling to out-of-distribution domain sizes while maintaining the accuracy of predictions -- a feature not easily integrated into popular DL-based approaches for physics simulations. We benchmark our model against two widely-used DL architectures, Fourier Neural Operator (FNO) and U-Net, and demonstrate that our framework outperforms them in terms of accuracy, extrapolation to unseen timesteps, and stability for a wide range of use cases.
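As a rough illustration of the pipeline the abstract describes — subdomain decomposition, encoding to a low-dimensional basis, and timestepping entirely in latent space — the sketch below uses toy stand-ins. The linear encoder/decoder, latent dimension, tile size, and two-step update rule are all placeholder assumptions, not the authors' trained CNN autoencoder or MLP:

```python
import numpy as np

rng = np.random.default_rng(0)

def split_into_subdomains(field, sub):
    """Partition a 2-D field into non-overlapping sub x sub tiles,
    flattened to one row per subdomain."""
    H, W = field.shape
    tiles = field.reshape(H // sub, sub, W // sub, sub).swapaxes(1, 2)
    return tiles.reshape(-1, sub * sub)

# Stand-ins for the trained CNN autoencoder: a random linear map into a
# low-dimensional latent basis, with its pseudo-inverse as the decoder.
sub, latent_dim = 8, 16
W_enc = rng.standard_normal((sub * sub, latent_dim)) / sub
W_dec = np.linalg.pinv(W_enc)

def encode(tiles):
    return tiles @ W_enc            # (n_subdomains, latent_dim)

def decode(z):
    return z @ W_dec                # (n_subdomains, sub * sub)

def latent_step(z_prev, z_curr):
    # Toy autoregressive update from a two-step history; the real model is
    # an MLP over concatenated solution and condition embeddings.
    return 1.5 * z_curr - 0.5 * z_prev

# Timestepping happens entirely in latent space; decode only when a full
# field is needed.
field = rng.standard_normal((32, 32))
z_prev = z_curr = encode(split_into_subdomains(field, sub))
for _ in range(10):
    z_prev, z_curr = z_curr, latent_step(z_prev, z_curr)
tiles_out = decode(z_curr)
print(tiles_out.shape)  # (16, 64): one flattened 8x8 tile per subdomain
```

The key cost saving is that the rollout loop never touches the full-resolution field: each step operates on a `(n_subdomains, latent_dim)` array rather than the full grid.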
Problem

Research questions and friction points this paper is trying to address.

Modeling unsteady nonlinear PDEs
Domain-decomposition deep learning
Enhancing computational scalability and stability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Domain decomposition for PDE modeling
CNN-based autoencoder reduces dimensionality
Curriculum learning enhances model stability
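One common way to realize curriculum-learning stabilization of autoregressive rollouts is to unroll the model for progressively longer horizons as training proceeds, so the loss gradually exposes the model to its own accumulated error. The schedule below is an illustrative guess at the shape of such a curriculum; the parameter values are assumptions, not the paper's:

```python
def rollout_horizon(epoch, start=1, step_every=10, max_horizon=8):
    """Number of autoregressive steps unrolled in the training loss at a
    given epoch: start with one-step prediction, lengthen gradually,
    and cap at max_horizon. (Schedule parameters are assumptions.)"""
    return min(start + epoch // step_every, max_horizon)

print([rollout_horizon(e) for e in range(0, 80, 10)])
# [1, 2, 3, 4, 5, 6, 7, 8]
```

Training only on one-step targets tends to produce models whose errors compound at inference time; growing the horizon forces the network to remain accurate on its own predictions.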
Sheel Nidhan
Ansys, Inc., San Jose, CA
Haoliang Jiang
Ansys, Inc., San Jose, CA
Lalit Ghule
Ansys, Inc., Canonsburg, PA
C. Umphrey
Ansys, Inc., Salt Lake City, UT
Rishikesh Ranade
Senior Engineer - Physics ML, NVIDIA
Jay Pathak
Ansys, Inc., San Jose, CA