Estimating Complex Densities using Two-Stage Normalizing Flows

📅 2026-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenging problem of density estimation for complex target distributions that combine an analytically tractable component with a non-analytic component accessible only through samples from a simulator or dataset, where direct sampling or closed-form evaluation is infeasible. The authors propose a two-stage normalizing flow framework: the first stage learns the density of the non-analytic component from samples, and the second stage integrates this estimate with the analytic term to reconstruct the full target distribution, enabling efficient sampling and point-wise density evaluation. This approach unifies heterogeneous sources of information (sample-driven data and analytic priors), allowing stable approximate inference without requiring access to the complete target density or joint samples. Experiments on Bayesian hierarchical models, joint density estimation, and a large-scale astronomical dataset demonstrate its ability to accurately recover highly nonlinear structures, and the astronomy analysis highlights informative differences between the proposed method and existing approaches.

📝 Abstract
In many scientific applications, the target probability distribution cannot be evaluated in closed form or sampled from directly. Instead, it can often be decomposed into multiple components, some of which are accessible only through samples generated by simulators or external datasets, while others admit tractable mathematical expressions or are specified through statistical assumptions about variable relationships. Developing inference methods that coherently integrate these heterogeneous sources of information remains an open challenge. In this paper, we propose a Two-Stage Normalizing Flows framework for approximating and sampling from such distributions. The method first learns the densities of components for which only samples are available, and then combines the outputs with the analytically specified terms to reconstruct the full target distribution in a second stage. The resulting model enables both point-wise density evaluation and efficient generation of representative samples, without requiring direct access to the full target density or joint samples from the complete model. We assess the proposed approach through simulation studies in joint density inference and Bayesian hierarchical models with inaccessible likelihoods. The proposed framework is able to accurately recover complex, highly nonlinear target structures using only partial information about the target density, providing stable and flexible approximations in settings where standard modeling assumptions do not hold (or when complete access to the target distribution is not available). Analysis of a large-scale astronomy application highlights interesting differences between our method and existing approaches. Our normalizing flows procedure offers a robust and flexible approach to inference for intractable target distributions across both simulated and real-world applications.
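The two-stage idea described in the abstract can be illustrated with a minimal numerical sketch. This is not the authors' implementation: a simple Gaussian fit stands in for the stage-one normalizing flow learned from simulator samples, and a hypothetical Gaussian conditional `y | x ~ N(x, 1)` plays the role of the analytically specified term. The point is only the structure: stage one yields a density estimate from samples alone, and stage two combines it with the analytic factor into a full target that can be evaluated point-wise.

```python
import math
import random

random.seed(0)

# Stage 1 (stand-in): the paper trains a normalizing flow on samples of the
# non-analytic component; here a Gaussian fit plays that role so the sketch
# stays self-contained.
samples = [random.gauss(2.0, 0.5) for _ in range(5000)]  # simulator output
mu = sum(samples) / len(samples)
var = sum((x - mu) ** 2 for x in samples) / len(samples)

def learned_density(x):
    """Stage-one density estimate of the non-analytic component."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Analytic component (assumed for illustration): y | x ~ N(x, 1).
def analytic_conditional(y, x):
    return math.exp(-(y - x) ** 2 / 2) / math.sqrt(2 * math.pi)

# Stage 2: the full target multiplies both pieces. In the paper, a second
# flow is fit to this combined density; here we only evaluate it point-wise.
def target_density(x, y):
    return learned_density(x) * analytic_conditional(y, x)

print(target_density(2.0, 2.0))
```

In the actual framework both stages are flexible normalizing flows, so the stage-one estimate can capture highly nonlinear, non-Gaussian structure, and stage two also provides efficient sampling from the reconstructed target rather than evaluation alone.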
Problem

Research questions and friction points this paper is trying to address.

intractable distributions
heterogeneous information
density estimation
normalizing flows
Bayesian inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Two-Stage Normalizing Flows
Intractable Density Estimation
Heterogeneous Information Integration
Simulation-Based Inference
Bayesian Hierarchical Models
Roxana Darvishi
Department of Statistics and Actuarial Science, Simon Fraser University
David C. Stenning
Department of Statistics and Actuarial Science, Simon Fraser University
Ted von Hippel
Embry-Riddle Aeronautical University
Astronomy, Astrophysics
Owen G. Ward
Department of Statistics and Actuarial Science, Simon Fraser University