🤖 AI Summary
This work proposes a unified flow-matching generative model for simulation-based inference that overcomes the inefficiency of traditional methods, which require separate model training for distinct query tasks such as posterior sampling or likelihood estimation. By introducing a query-aware masking distribution, the framework supports diverse conditional inference tasks within a single model without task-specific retraining. Leveraging an ordinary differential equation (ODE) solver, the approach enables efficient sampling and, for the first time, achieves general-purpose, efficient, and robust multi-task simulation-based inference within one unified architecture. Evaluated on ten benchmark problems and two high-dimensional real-world inverse problems, the method matches or exceeds state-of-the-art performance while significantly improving sampling efficiency and demonstrating strong robustness to missing data and noise.
📝 Abstract
We introduce \textit{OneFlowSBI}, a unified framework for simulation-based inference that learns a single flow-matching generative model over the joint distribution of parameters and observations. Leveraging a query-aware masking distribution during training, the same model supports multiple inference tasks, including posterior sampling, likelihood estimation, and arbitrary conditional distributions, without task-specific retraining. We evaluate \textit{OneFlowSBI} on ten benchmark inference problems and two high-dimensional real-world inverse problems across multiple simulation budgets. \textit{OneFlowSBI} is shown to deliver competitive performance against state-of-the-art generalized inference solvers and specialized posterior estimators, while enabling efficient sampling with few ODE integration steps and remaining robust under noisy and partially observed data.