DGNO: A Novel Physics-aware Neural Operator for Solving Forward and Inverse PDE Problems based on Deep, Generative Probabilistic Modeling

📅 2025-02-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing neural operator methods for high-dimensional, discontinuous parametric PDEs—both forward and inverse problems—are limited by their reliance on large volumes of labeled data and by the numerical instability or intractability of computing high-order derivatives. Method: We propose the Deep Generative Neural Operator (DGNO), a physics-aware framework that (i) constructs weak-form residuals using compactly supported radial basis functions (CSRBFs) as virtual observables, thereby avoiding explicit high-order differentiation; (ii) jointly encodes PDE inputs and outputs via low-dimensional latent variables; and (iii) integrates deep generative modeling with hard physical constraints to enable unsupervised and semi-supervised learning. It is built on MultiONet, a novel neural operator architecture that generalizes DeepONet for greater expressiveness. Contribution/Results: DGNO achieves significantly improved solution accuracy on challenging PDE tasks—e.g., multiphase media—while exhibiting strong robustness to noise, out-of-distribution generalization, adaptability to sparse/noisy data, and built-in capability for uncertainty quantification.
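The weak-form trick summarized above can be illustrated on a toy 1D Poisson problem. The sketch below is not the paper's implementation: it only shows why testing −u″ = f against a compactly supported test function (here a Wendland C2 kernel, which the paper's CSRBFs may differ from) removes the second derivative of u via integration by parts, so only u′ is ever evaluated.

```python
import numpy as np

def wendland_c2(r):
    # Wendland C2 CSRBF: (1-r)^4 (4r+1) on [0, 1), zero outside its support
    return np.where(r < 1.0, (1 - r)**4 * (4*r + 1), 0.0)

def wendland_c2_deriv(r):
    # d/dr of the Wendland C2 kernel: -20 r (1-r)^3 on [0, 1)
    return np.where(r < 1.0, -20.0 * r * (1 - r)**3, 0.0)

def weak_residual(u_x, f, center, radius, x):
    """Weak-form residual of -u'' = f against one CSRBF test function phi.

    Integration by parts (boundary terms vanish by compact support) gives
        r = \int u'(x) phi'(x) dx - \int f(x) phi(x) dx,
    so only the FIRST derivative of u is needed.
    u_x, f : callables for u' and f; center, radius : CSRBF placement;
    x : quadrature grid covering the domain.
    """
    s = np.abs(x - center) / radius
    phi = wendland_c2(s)
    # chain rule: d phi / dx = phi'(s) * sign(x - center) / radius
    dphi = wendland_c2_deriv(s) * np.sign(x - center) / radius
    integrand = u_x(x) * dphi - f(x) * phi
    # trapezoidal quadrature
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x)))

# Sanity check with an exact solution: u = sin(pi x) solves -u'' = pi^2 sin(pi x),
# so the weak residual should vanish (up to quadrature error).
x = np.linspace(0.0, 1.0, 2001)
res = weak_residual(lambda t: np.pi * np.cos(np.pi * t),       # u'
                    lambda t: np.pi**2 * np.sin(np.pi * t),    # f
                    center=0.5, radius=0.3, x=x)
```

For the exact solution the residual is near machine/quadrature precision; for a neural surrogate it becomes a physics-based training signal that never requires u″.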

📝 Abstract
Solving parametric partial differential equations (PDEs) and associated PDE-based inverse problems is a central task in engineering and physics, yet existing neural operator methods struggle with high-dimensional, discontinuous inputs and require large amounts of *labeled* training data. We propose the Deep Generative Neural Operator (DGNO), a physics-aware framework that addresses these challenges by leveraging a deep, generative, probabilistic model in combination with a set of lower-dimensional latent variables that simultaneously encode PDE inputs and PDE outputs. This formulation can make use of unlabeled data and significantly improves inverse problem-solving, particularly for discontinuous or discrete-valued input functions. DGNO enforces physics constraints without labeled data by incorporating, as virtual observables, weak-form residuals based on compactly supported radial basis functions (CSRBFs). These relax regularity requirements and eliminate higher-order derivatives from the objective function. We also introduce MultiONet, a novel neural operator architecture that is a more expressive generalization of the popular DeepONet and significantly enhances the approximating power of the proposed model. These innovations make DGNO particularly effective for challenging forward and inverse PDE-based problems, such as those involving multi-phase media. Numerical experiments demonstrate that DGNO achieves higher accuracy across multiple benchmarks while exhibiting robustness to noise and strong generalization to out-of-distribution cases. Its adaptability and its ability to handle sparse, noisy data while providing probabilistic estimates make DGNO a powerful tool for scientific and engineering applications.
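MultiONet's internals are not reproduced here, but the DeepONet baseline it generalizes has a simple structure worth recalling: a branch network encodes the input function sampled at fixed sensors, a trunk network encodes query locations, and the operator output is their dot product over a shared latent basis. A minimal random-weight sketch (untrained networks, hypothetical sizes) of that baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(widths):
    # Random-weight tanh MLP; stands in for a trained network.
    Ws = [rng.normal(0, 1 / np.sqrt(m), (m, n))
          for m, n in zip(widths[:-1], widths[1:])]
    def forward(x):
        for W in Ws[:-1]:
            x = np.tanh(x @ W)
        return x @ Ws[-1]          # linear final layer
    return forward

m, p = 32, 16                 # number of input-function sensors, basis size
branch = mlp([m, 64, p])      # encodes the sampled input function u
trunk = mlp([1, 64, p])       # encodes 1D query locations y

def deeponet(u_sensors, y):
    # G(u)(y) ~= sum_k branch_k(u) * trunk_k(y): dot product over the basis
    b = branch(u_sensors[None, :])   # shape (1, p)
    t = trunk(y)                     # shape (n_query, p)
    return (t * b).sum(axis=1)       # shape (n_query,)

u = np.sin(np.linspace(0.0, np.pi, m))   # an input function at m sensors
y = np.linspace(0.0, 1.0, 50)[:, None]   # 50 query points
out = deeponet(u, y)                     # predicted G(u)(y) at those points
```

Per the abstract, MultiONet replaces this single branch–trunk inner product with a more expressive combination; the exact form is given in the paper.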
Problem

Research questions and friction points this paper is trying to address.

Solves parametric PDEs and inverse problems
Handles high-dimensional, discontinuous inputs effectively
Utilizes unlabeled data with improved accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep Generative Neural Operator
Physics-aware framework
Unlabeled data utilization
Yaohua Zang
Technical University of Munich, Professorship of Data-driven Materials Modeling, School of Engineering and Design, Boltzmannstr. 15, 85748 Garching, Germany
Phaedon-Stelios Koutsourelakis
Professorship of Data-driven Materials Modeling
Probabilistic Modeling · Coarse-graining · Statistical Learning · Interface of knowledge- and data-driven learning · UQ