🤖 AI Summary
Bayesian inverse problems often yield intractable posterior distributions, and conventional sampling methods suffer from high computational cost and poor generalization to variable-length observations. To address these challenges, we propose a novel posterior modeling framework that integrates Conditional Flow Matching (CFM) with a sequence-aware Transformer architecture. This is the first work to combine CFM with a Transformer encoder, leveraging neural ordinary differential equations (neural ODEs) for continuous-time flow parameterization and introducing a variable-length observation embedding mechanism to enable efficient, well-calibrated posterior sampling across arbitrary observation lengths. Evaluated on multiple physics-based inverse problem benchmarks, our method achieves 5–10× faster posterior generation than state-of-the-art samplers while preserving high fidelity and probabilistic calibration. The approach significantly enhances the practicality and scalability of Bayesian inversion.
📝 Abstract
Solving Bayesian inverse problems efficiently remains a significant challenge due to the complexity of posterior distributions and the computational cost of traditional sampling methods. Given a series of observations and the forward model, we want to recover the distribution of the parameters conditioned on the observed experimental data. We show that by combining Conditional Flow Matching (CFM) with a Transformer-based architecture, we can efficiently sample from such distributions, conditioned on a variable number of observations.
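The two ingredients mentioned above can be sketched concretely. Below is a minimal, illustrative NumPy sketch of (a) the standard CFM regression target under a straight-line interpolation path between a base sample and a target sample, and (b) padding variable-length observation sequences into a batch with a boolean mask of the kind a Transformer encoder consumes. The function names, shapes, and padding convention are assumptions for illustration, not the paper's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_pair(theta_base, theta_target, t):
    """Interpolate between a base sample and a target sample and return
    (x_t, target velocity) for the flow-matching regression.

    With the straight-line path x_t = (1 - t) * x0 + t * x1, the target
    velocity field is constant along the path: v = x1 - x0.
    """
    x_t = (1.0 - t) * theta_base + t * theta_target
    v_target = theta_target - theta_base
    return x_t, v_target

def pad_observations(obs_list, pad_value=0.0):
    """Pad variable-length 1-D observation arrays into one batch.

    Returns the padded batch and a boolean mask where True marks padded
    positions (the convention used by key-padding masks in Transformers).
    """
    max_len = max(len(o) for o in obs_list)
    batch = np.full((len(obs_list), max_len), pad_value)
    mask = np.zeros((len(obs_list), max_len), dtype=bool)
    for i, o in enumerate(obs_list):
        batch[i, : len(o)] = o
        mask[i, len(o):] = True
    return batch, mask

# Hypothetical usage: one CFM training pair and one padded observation batch.
theta0 = rng.normal(size=3)   # sample from the base (e.g. prior) density
theta1 = rng.normal(size=3)   # sample coupled to the target (posterior)
x_t, v = cfm_pair(theta0, theta1, t=0.5)

obs, mask = pad_observations([np.arange(4.0), np.arange(2.0)])
```

In training, a velocity network conditioned on `t` and on the encoded (masked) observations would be regressed onto `v_target`; at sampling time the learned velocity field is integrated as a neural ODE from the base distribution to the posterior.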