A Primer on Variational Inference for Physics-Informed Deep Generative Modelling

📅 2024-09-10
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of uncertainty quantification in both forward and inverse problems within physics-informed deep generative models. Methodologically, it introduces a Bayesian modelling paradigm based on variational inference (VI), establishing for the first time systematic principles for constructing variational distributions whose conditional structure is constrained by physical laws. It develops a general framework for deriving the evidence lower bound (ELBO) tailored to partial differential equation (PDE)-driven problems and implements physics-embedded, scalable approximate inference via neural-network parameterisation. The core contributions are: (i) a theoretically grounded unification of VI and physics-based modelling, enabling improved uncertainty calibration and generalisation in inverse solutions; and (ii) empirical validation across diverse PDE inverse problems, demonstrating robustness, computational efficiency, and resilience to observational noise and model misspecification.
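The ELBO framework mentioned in the summary builds on the standard variational bound; as background, it is sketched here in generic form (not the paper's specific PDE-constrained variant), with $y$ the observations, $\theta$ the latent quantity (e.g. a PDE parameter field), and $q_\phi$ the variational approximation:

```latex
\log p(y) \;\ge\; \mathbb{E}_{q_\phi(\theta)}\big[\log p(y \mid \theta)\big] \;-\; \mathrm{KL}\big(q_\phi(\theta)\,\|\,p(\theta)\big) \;=:\; \mathrm{ELBO}(\phi)
```

Maximising the right-hand side over $\phi$ tightens the bound on the marginal likelihood and drives $q_\phi(\theta)$ toward the posterior $p(\theta \mid y)$.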

📝 Abstract
Variational inference (VI) is a computationally efficient and scalable methodology for approximate Bayesian inference. It strikes a balance between the accuracy of uncertainty quantification and practical tractability, and it excels at generative modelling and inversion tasks owing to its built-in Bayesian regularisation and flexibility, qualities essential for physics-related problems. The central learning objective of VI must often be tailored to new learning tasks in which the nature of the problem dictates the conditional dependence between the variables of interest, as arises in physics problems. In this paper, we provide an accessible and thorough technical introduction to VI for forward and inverse problems, guiding the reader through standard derivations of the VI framework and showing how it can best be realised through deep learning. We then review and unify recent literature that exemplifies the creative flexibility allowed by VI. The paper is aimed at a general scientific audience looking to solve physics-based problems, with an emphasis on uncertainty quantification.
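As a concrete illustration of the VI recipe the abstract describes, here is a minimal, self-contained sketch (the toy model, values, and variable names are illustrative, not from the paper): a conjugate Gaussian model in which a mean-field Gaussian q(θ) = N(μ, s²) is fitted by gradient ascent on the closed-form ELBO, and the result is checked against the exact posterior.

```python
import math

# Toy conjugate model (illustrative, not from the paper):
# prior theta ~ N(0, 1), likelihood y ~ N(theta, sigma2), one observation y.
sigma2 = 0.5
y = 1.2

# Exact posterior (available here because the model is conjugate),
# used only to check the variational fit.
post_prec = 1.0 + 1.0 / sigma2          # posterior precision
post_mu = (y / sigma2) / post_prec      # posterior mean
post_var = 1.0 / post_prec              # posterior variance

# Mean-field Gaussian q(theta) = N(mu, s^2).  For this model the ELBO has a
# closed form, so we ascend its exact gradient in (mu, log s).
mu, log_s = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    s = math.exp(log_s)
    d_mu = (y - mu) / sigma2 - mu        # dELBO/dmu: likelihood pull + prior pull
    d_s = -s / sigma2 - s + 1.0 / s      # dELBO/ds: data, prior, entropy terms
    mu += lr * d_mu
    log_s += lr * d_s * s                # chain rule for the log-s parameterisation

print(mu, math.exp(2 * log_s))          # converges to (post_mu, post_var) = (0.8, 1/3)
```

In the PDE settings the paper surveys, the likelihood term would involve a (typically neural-network-parameterised) forward solve, and the gradients would come from Monte Carlo/reparameterised ELBO estimates under automatic differentiation, but the objective being climbed is the same bound.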
Problem

Research questions and friction points this paper is trying to address.

Develop variational inference for physics-informed deep generative modeling
Balance accuracy and tractability in uncertainty quantification
Tailor VI learning for physics-based forward and inverse problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variational inference for scalable Bayesian modeling
Deep learning integration with physics-informed VI
Tailored VI derivations for physical dynamics
Alex Glyn-Davies
Department of Engineering, University of Cambridge
A. Vadeboncoeur
Department of Engineering, University of Cambridge
O. D. Akyildiz
Department of Mathematics, Imperial College London
Ieva Kazlauskaite
Department of Statistics, London School of Economics and Political Science
M. Girolami
The Alan Turing Institute