An invertible generative model for forward and inverse problems

📅 2025-09-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the joint solution of forward (likelihood sampling) and inverse (posterior sampling) problems within a Bayesian framework. Methodologically, it constructs an invertible generative model by composing an upper- and a lower-triangular normalizing flow into a single bijective mapping between parameter and observation spaces, and proposes a training objective for optimizing this map directly, so that one model supports both forward simulation and posterior inference. Its key contribution is the combination of the two triangular conditional maps into one invertible network that jointly models the likelihood and the posterior, avoiding the approximation bias of variational inference and the sampling cost of MCMC. On several stylized numerical examples, the model performs forward generation and inverse inference simultaneously with a single trained map.
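As a minimal illustration of the triangular-map idea (a toy linear-Gaussian sketch, not the paper's actual architecture), a lower-triangular factor lets the observation coordinate depend on the parameter coordinate (likelihood direction), an upper-triangular factor does the reverse (posterior direction), and stacking the two yields one invertible map on the joint space, analogous to an LU factorization. All names and matrix values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: parameter x and observation y are each 1-D, so the joint
# space (x, y) is 2-D and triangular flows reduce to triangular matrices.
L = np.array([[1.0, 0.0],
              [0.7, 1.0]])   # lower-triangular factor: y depends on x (likelihood direction)
U = np.array([[1.0, 0.4],
              [0.0, 1.0]])   # upper-triangular factor: x depends on y (posterior direction)

A = L @ U                    # stacked map: one invertible transform on (x, y)

# Push Gaussian base samples through the combined map to get joint samples.
z = rng.standard_normal((1000, 2))
xy = z @ A.T

# Invertibility: the inverse map recovers the base samples exactly.
z_rec = xy @ np.linalg.inv(A).T
print(np.allclose(z, z_rec))            # True

# Conditional (likelihood) sampling via the lower-triangular factor alone:
# fix x = x0 and push fresh noise through the second row.
x0 = 1.5
eps = rng.standard_normal(1000)
y_given_x = L[1, 0] * x0 + L[1, 1] * eps   # samples from y | x = x0
```

In the nonlinear case the triangular matrices become Knothe–Rosenblatt-style triangular flows, but the composition and inversion logic is the same.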

📝 Abstract
We formulate the inverse problem in a Bayesian framework and aim to train a generative model that allows us to simulate (i.e., sample from the likelihood) and do inference (i.e., sample from the posterior). We review the use of triangular normalizing flows for conditional sampling in this context and show how to combine two such triangular maps (an upper and a lower one) into one invertible mapping that can be used for simulation and inference. We work out several useful properties of this invertible generative model and propose a possible training loss for training the map directly. We illustrate the workings of this new approach to conditional generative modeling numerically on a few stylized examples.
Problem

Research questions and friction points this paper is trying to address.

Solving Bayesian inverse problems with generative models
Combining triangular maps for invertible simulation and inference
Training invertible mappings for conditional sampling tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian framework for simulation and inference
Combining triangular normalizing flows for invertible mapping
Direct training loss for conditional generative modeling
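The "direct training loss" bullet can be made concrete with a generic change-of-variables objective: for an invertible map pushing a standard Gaussian base onto the joint (parameter, observation) distribution, the negative log-likelihood of data under the map is a natural training criterion. The sketch below is a hypothetical linear instance of this idea, not the loss proposed in the paper:

```python
import numpy as np

def nll(A, xy):
    """Average negative log-likelihood of joint samples xy under the model
    (x, y) = A z with z ~ N(0, I), by the change-of-variables formula:
    -log p(xy) = 0.5 * ||A^{-1} xy||^2 + log|det A| + 0.5 * d * log(2*pi)."""
    z = xy @ np.linalg.inv(A).T
    d = xy.shape[1]
    return (0.5 * (z ** 2).sum(axis=1).mean()
            + np.log(abs(np.linalg.det(A)))
            + 0.5 * d * np.log(2 * np.pi))

rng = np.random.default_rng(1)

# Ground-truth map: lower-triangular times upper-triangular factor (illustrative values).
A_true = np.array([[1.0, 0.0],
                   [0.7, 1.0]]) @ np.array([[1.0, 0.4],
                                            [0.0, 1.0]])
data = rng.standard_normal((5000, 2)) @ A_true.T

# The data-generating map scores a lower loss than a mismatched identity map.
print(nll(A_true, data) < nll(np.eye(2), data))   # True
```

In practice one would parameterize the two triangular factors with neural networks and minimize such a loss by gradient descent; the paper's exact objective may differ.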