🤖 AI Summary
This work addresses a longstanding limitation in conditional density estimation: the lack of closed-form expressions for multivariate conditional densities outside the Gaussian setting. We propose a generative conditional density estimation framework grounded in copula modeling and analytic conditioning in a latent space. Methodologically, we first establish that stability under conditioning is preserved by mixtures and transformations, thereby extending the analytically tractable conditional families to non-Gaussian, nonlinear, and transdimensional settings. The core components are a Gaussian Mixture Copula Model (GMCM), an explicit latent-space conditioning mechanism, and joint copula modeling. Experiments on synthetic and real-world datasets demonstrate substantial improvements in conditional density estimation accuracy and in the robustness of missing-data imputation. Crucially, the approach enables efficient, differentiable, and sampling-free deterministic conditional inference.
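To make the latent-space conditioning concrete, below is a minimal sketch (an illustration under stated assumptions, not the authors' implementation) of the analytic step that GMCM-style conditioning relies on: conditioning a Gaussian mixture on an observed block of coordinates. The function name `condition_gmm` and the array layout are hypothetical.

```python
# Illustrative sketch: analytic conditioning of a Gaussian mixture in latent
# space. Each component is conditioned with the standard Gaussian formula,
# and mixture weights are re-weighted by each component's marginal
# likelihood of the observed block.
import numpy as np
from scipy.stats import multivariate_normal

def condition_gmm(weights, means, covs, obs_idx, obs_val):
    """Condition a Gaussian mixture p(x) on x[obs_idx] = obs_val.

    weights: (K,) mixture weights
    means:   (K, d) component means
    covs:    (K, d, d) component covariances
    Returns the weights, means, and covariances of the conditional mixture
    over the remaining coordinates.
    """
    free_idx = [i for i in range(means.shape[1]) if i not in set(obs_idx)]
    new_w, new_mu, new_cov = [], [], []
    for w, mu, S in zip(weights, means, covs):
        S_oo = S[np.ix_(obs_idx, obs_idx)]
        S_ff = S[np.ix_(free_idx, free_idx)]
        S_fo = S[np.ix_(free_idx, obs_idx)]
        # Gaussian conditional mean and covariance of the free block
        gain = S_fo @ np.linalg.inv(S_oo)
        mu_c = mu[free_idx] + gain @ (obs_val - mu[obs_idx])
        S_c = S_ff - gain @ S_fo.T
        # Responsibility of this component for the observed values
        lik = multivariate_normal.pdf(obs_val, mean=mu[obs_idx], cov=S_oo)
        new_w.append(w * lik)
        new_mu.append(mu_c)
        new_cov.append(S_c)
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu), np.array(new_cov)
```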
📝 Abstract
We address the challenge of conditioning multivariate densities, extending analytical conditioning results far beyond the Gaussian case. We review and discuss families of multivariate distributions that do enjoy analytical conditioning, also providing a few counter-examples. Proving that transdimensional stability under conditioning extends to mixtures and transformations, we demonstrate that a broader class of multivariate distributions inherits easy conditioning properties. Building on this insight, we develop a generative method to estimate conditional distributions from data by first fitting a flexible joint distribution using copulas and then performing analytical conditioning in a latent space. We specifically apply this methodology to Gaussian Mixture Copula Models (GMCM) and examine various fitting strategies. Through simulations and real-world data experiments, we showcase the efficacy of our method in tasks involving conditional density estimation and data imputation.
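As a usage illustration (again hypothetical, building on the `condition_gmm` sketch above), conditioning a toy latent mixture on one observed coordinate yields a new mixture whose mean can serve as a deterministic, sampling-free imputation of the remaining latent coordinates; mapping back through the fitted marginals would complete the imputation step described above.

```python
# Toy example: a 2-component, 3-dimensional latent Gaussian mixture.
weights = np.array([0.6, 0.4])
means = np.array([[0.0, 0.0, 0.0],
                  [2.0, 1.0, -1.0]])
covs = np.stack([np.eye(3), 0.5 * np.eye(3)])

# Observe the first latent coordinate and condition on it analytically.
w_c, mu_c, cov_c = condition_gmm(weights, means, covs,
                                 obs_idx=[0], obs_val=np.array([1.5]))

# Sampling-free imputation of the remaining coordinates: the mixture mean.
imputed = (w_c[:, None] * mu_c).sum(axis=0)
print("conditional weights:", w_c)
print("imputed latent values:", imputed)
```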