Generative Neural Operators through Diffusion Last Layer

📅 2026-02-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing neural operators struggle to perform reliable uncertainty quantification for stochastic partial differential equations (SPDEs). To address this limitation, this work proposes the Diffusion Last Layer (DLL), a lightweight, plug-and-play probabilistic head that can be integrated with any neural operator backbone. DLL parameterizes the conditional output distribution directly in function space through a low-rank Karhunen–Loève expansion, enabling efficient and expressive modeling of the output distribution. This approach improves generalization and uncertainty-aware performance in SPDE operator learning, while also improving the stability of long-time rollouts and the calibration of epistemic uncertainty estimates.

📝 Abstract
Neural operators have emerged as a powerful paradigm for learning discretization-invariant function-to-function mappings in scientific computing. However, many practical systems are inherently stochastic, making principled uncertainty quantification essential for reliable deployment. To address this, we introduce a simple add-on, the diffusion last layer (DLL), a lightweight probabilistic head that can be attached to arbitrary neural operator backbones to model predictive uncertainty. Motivated by the relative smoothness and low-dimensional structure often exhibited by PDE solution distributions, DLL parameterizes the conditional output distribution directly in function space through a low-rank Karhunen–Loève expansion, enabling efficient and expressive uncertainty modeling. Across stochastic PDE operator learning benchmarks, DLL improves generalization and uncertainty-aware prediction. Moreover, even in deterministic long-horizon rollout settings, DLL enhances rollout stability and provides meaningful estimates of epistemic uncertainty for backbone neural operators.
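To make the abstract's key idea concrete, here is a minimal sketch of sampling from a rank-K Karhunen–Loève expansion of a predictive distribution over functions. This is not the paper's implementation: the mean, basis functions, and eigenvalues are hand-picked stand-ins (in DLL they would be produced by the neural operator head, and the mode coefficients would be drawn from a conditional diffusion model rather than a plain Gaussian, as assumed here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a 1-D output function discretized on n grid points,
# with a rank-K Karhunen-Loeve (KL) expansion of the predictive law.
n, K = 64, 4
x = np.linspace(0.0, 1.0, n)

# Stand-ins for what a learned head would output: a mean function,
# K orthonormal basis functions phi_k, and K decaying eigenvalues
# lambda_k controlling per-mode variance (low-rank structure).
mean = np.sin(2.0 * np.pi * x)                                        # (n,)
basis = np.stack(
    [np.sqrt(2.0) * np.sin((k + 1) * np.pi * x) for k in range(K)]
)                                                                     # (K, n)
eigvals = 0.5 ** np.arange(K)                                         # (K,)


def sample_kl(num_samples: int) -> np.ndarray:
    """Draw function samples u = mean + sum_k sqrt(lambda_k) * z_k * phi_k.

    Here z_k ~ N(0, 1) for simplicity; DLL instead generates the
    coefficients with a conditional diffusion model.
    """
    z = rng.standard_normal((num_samples, K))          # mode coefficients
    return mean + (z * np.sqrt(eigvals)) @ basis       # (num_samples, n)


samples = sample_kl(1000)
# Under the expansion, the pointwise variance is sum_k lambda_k * phi_k(x)^2,
# so the empirical variance of the samples should approach it.
theoretical_var = (eigvals[:, None] * basis**2).sum(axis=0)
```

The appeal of this parameterization is that uncertainty over an n-dimensional discretized function is captured by only K latent coefficients, so the generative model operates in a small, resolution-independent space.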
Problem

Research questions and friction points this paper is trying to address.

neural operators · uncertainty quantification · stochastic PDEs · function space · predictive uncertainty
Innovation

Methods, ideas, or system contributions that make the work stand out.

neural operators · uncertainty quantification · diffusion last layer · Karhunen–Loève expansion · stochastic PDEs
Sungwon Park
KAIST
Social Computing · AI for Social Good
Anthony Zhou
PhD Candidate, Carnegie Mellon University
Scientific Machine Learning
Hongjoong Kim
Korea University, Seoul, South Korea
A. Farimani
Carnegie Mellon University, Pittsburgh, USA