LVM-GP: Uncertainty-Aware PDE Solver via coupling latent variable model and Gaussian process

📅 2025-07-30
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses forward and inverse partial differential equation (PDE) problems with noisy observational data by proposing an uncertainty-aware probabilistic solving framework. Methodologically, it integrates a latent-variable Gaussian process prior with neural operators within a confidence-aware encoder–probabilistic decoder architecture: the encoder learns a data-driven distribution over latent variables, while the decoder employs a neural operator to output a conditional Gaussian distribution over the solution field; a learnable confidence function dynamically fuses deterministic features with stochastic priors, and physical laws are incorporated via soft constraints. Compared to Bayesian physics-informed neural networks (B-PINNs) and deep ensembles, the framework achieves significantly improved predictive accuracy and better-calibrated uncertainty quantification across multiple numerical benchmarks—while maintaining high modeling efficiency and robustness to data noise.

Technology Category

Application Category

📝 Abstract
We propose a novel probabilistic framework, termed LVM-GP, for uncertainty quantification in solving forward and inverse partial differential equations (PDEs) with noisy data. The core idea is to construct a stochastic mapping from the input to a high-dimensional latent representation, enabling uncertainty-aware prediction of the solution. Specifically, the architecture consists of a confidence-aware encoder and a probabilistic decoder. The encoder implements a high-dimensional latent variable model based on a Gaussian process (LVM-GP), where the latent representation is constructed by interpolating between a learnable deterministic feature and a Gaussian process prior, with the interpolation strength adaptively controlled by a confidence function learned from data. The decoder defines a conditional Gaussian distribution over the solution field, where the mean is predicted by a neural operator applied to the latent representation, allowing the model to learn flexible function-to-function mapping. Moreover, physical laws are enforced as soft constraints in the loss function to ensure consistency with the underlying PDE structure. Compared to existing approaches such as Bayesian physics-informed neural networks (B-PINNs) and deep ensembles, the proposed framework can efficiently capture functional dependencies via merging a latent Gaussian process and neural operator, resulting in competitive predictive accuracy and robust uncertainty quantification. Numerical experiments demonstrate the effectiveness and reliability of the method.
Problem

Research questions and friction points this paper is trying to address.

Uncertainty quantification in solving noisy PDEs
Stochastic mapping for uncertainty-aware predictions
Enforcing physical laws as soft constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Couples latent variable model with Gaussian process
Uses confidence-aware encoder and probabilistic decoder
Enforces physical laws as soft constraints
🔎 Similar Papers
No similar papers found.
Xiaodong Feng
Xiaodong Feng
Sun Yat-sen University
social mediadata mining
L
Ling Guo
Department of Mathematics, Shanghai Normal University, Shanghai, China.
Xiaoliang Wan
Xiaoliang Wan
Louisiana State University
Applied and Computational Mathematics
H
Hao Wu
School of Mathematical Sciences, Institute of Natural Sciences, and MOE-LSC, Shanghai Jiaotong University, Shanghai, China.
T
Tao Zhou
Institute of Computational Mathematics and Scientific /Engineering Computing, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China
W
Wenwen Zhou
Department of Mathematics, Shanghai Normal University, Shanghai, China.