🤖 AI Summary
This work proposes a Bayesian generative modeling (BGM) framework that overcomes the limitations of existing conditional inference methods, which are typically constrained by fixed conditioning structures and cannot flexibly support arbitrary variable partitions after training. By leveraging a stochastic iterative Bayesian updating algorithm to learn the joint distribution, the method enables a single model to perform probabilistic inference under any conditional partition without retraining. Integrating deep generative modeling with principled Bayesian inference, the approach provides theoretical guarantees on convergence, statistical consistency, and conditional risk bounds. Empirical evaluations demonstrate that the model not only achieves superior predictive performance across multiple tasks but also yields well-calibrated prediction intervals, confirming its effectiveness as a general-purpose engine for conditional inference.
📝 Abstract
Modern data analysis increasingly requires flexible conditional inference P(X_B | X_A), where (X_A, X_B) is an arbitrary partition of the observed variable X. Existing conditional inference methods lack this flexibility because they are tied to a fixed conditioning structure and cannot perform new conditional inference once trained. To address this, we propose a Bayesian generative modeling (BGM) approach for arbitrary conditional inference without retraining. BGM learns a generative model of X through an iterative Bayesian updating algorithm in which model parameters and latent variables are updated until convergence. Once trained, any conditional distribution can be obtained without retraining. Empirically, BGM achieves superior prediction performance with well-calibrated predictive intervals, demonstrating that a single learned model can serve as a universal engine for conditional prediction with uncertainty quantification. We provide theoretical guarantees for the convergence of the stochastic iterative algorithm, statistical consistency, and conditional-risk bounds. The proposed BGM framework leverages the power of AI to capture complex relationships among variables while adhering to Bayesian principles, emerging as a promising framework for advancing various applications in modern data science. The code for BGM is freely available at https://github.com/liuq-lab/bayesgm.
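The core idea — learn one generative model of the joint distribution, then condition on any partition afterward — can be illustrated with a deliberately simple stand-in. The sketch below is not the paper's BGM (which uses deep generative networks and stochastic Bayesian updating); it substitutes a linear-Gaussian latent model fitted by EM-style alternating updates of latent posteriors and parameters, after which any conditional P(X_B | X_A) follows in closed form from the implied joint Gaussian. All function and variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ppca(X, k=2, iters=200):
    """Fit x = W z + mu + eps, z ~ N(0, I_k), eps ~ N(0, s2 I_d),
    by alternating latent-posterior and parameter updates (EM)."""
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    W = rng.standard_normal((d, k)) * 0.1
    s2 = 1.0
    for _ in range(iters):
        # Latent update: posterior moments of z given current parameters
        M = W.T @ W + s2 * np.eye(k)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                      # E[z_i | x_i], shape (n, k)
        Ezz = n * s2 * Minv + Ez.T @ Ez         # sum_i E[z_i z_i^T | x_i]
        # Parameter update: refit loading matrix and noise variance
        W = Xc.T @ Ez @ np.linalg.inv(Ezz)
        s2 = (np.sum(Xc**2) - np.trace(Ez.T @ Xc @ W)) / (n * d)
    return mu, W, s2

def conditional(mu, W, s2, idx_A, x_A, idx_B):
    """P(X_B | X_A = x_A) under the learned joint N(mu, W W^T + s2 I),
    for ANY partition (idx_A, idx_B) -- no retraining needed."""
    C = W @ W.T + s2 * np.eye(len(mu))
    CAA = C[np.ix_(idx_A, idx_A)]
    CBA = C[np.ix_(idx_B, idx_A)]
    CBB = C[np.ix_(idx_B, idx_B)]
    gain = CBA @ np.linalg.inv(CAA)
    mean_B = mu[idx_B] + gain @ (x_A - mu[idx_A])
    cov_B = CBB - gain @ CBA.T                  # predictive covariance -> intervals
    return mean_B, cov_B

# Simulate correlated 4-dimensional data, fit the joint model ONCE...
true_W = rng.standard_normal((4, 2))
Z = rng.standard_normal((2000, 2))
X = Z @ true_W.T + 0.1 * rng.standard_normal((2000, 4))
mu, W, s2 = fit_ppca(X, k=2)

# ...then query two different conditional partitions from the same fit.
m1, S1 = conditional(mu, W, s2, [0, 1], X[0, [0, 1]], [2, 3])   # X_{2,3} | X_{0,1}
m2, S2 = conditional(mu, W, s2, [3], X[0, [3]], [0, 1, 2])      # X_{0,1,2} | X_3
```

The predictive covariance `cov_B` is what yields calibrated intervals in the Gaussian toy case; BGM's contribution is achieving this flexibility with far richer deep generative families while retaining Bayesian guarantees.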