Hybrid Bernstein Normalizing Flows for Flexible Multivariate Density Regression with Interpretable Marginals

📅 2025-05-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
In multivariate density regression, existing methods struggle to achieve model flexibility and marginal interpretability at the same time: multivariate conditional transformation models (MCTMs) offer interpretability but limited expressive power, while normalizing flows (NFs) are highly flexible yet lack semantic interpretability. This paper proposes the MCTM-Bernstein Flow, a hybrid architecture that integrates the interpretable marginal modeling of MCTMs with an autoregressive, Bernstein-polynomial-based normalizing flow, establishing a semantic alignment between neural flows and classical statistical models in density regression. The method preserves the interpretability of the marginal distribution parameters while substantially increasing the modeling capacity for the joint dependence structure. Experiments report a reduction in KL divergence of more than 35% over both pure MCTMs and standard NFs on synthetic and real-world datasets. Moreover, the framework supports counterfactual inference, uncertainty quantification, and precise attribution of marginal effects.

📝 Abstract
Density regression models allow a comprehensive understanding of data by modeling the complete conditional probability distribution. While flexible estimation approaches such as normalizing flows (NF) work particularly well in multiple dimensions, interpreting the input-output relationship of such models is often difficult, due to the black-box character of deep learning models. In contrast, existing statistical methods for multivariate outcomes such as multivariate conditional transformation models (MCTM) are restricted in flexibility and are often not expressive enough to represent complex multivariate probability distributions. In this paper, we combine MCTM with state-of-the-art and autoregressive NF to leverage the transparency of MCTM for modeling interpretable feature effects on the marginal distributions in the first step and the flexibility of neural-network-based NF techniques to account for complex and non-linear relationships in the joint data distribution. We demonstrate our method's versatility in various numerical experiments and compare it with MCTM and other NF models on both simulated and real-world data.
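The two-step construction described in the abstract can be sketched as follows. The notation is illustrative, following the standard MCTM formulation (Bernstein basis polynomials and a Gaussian reference distribution) rather than the paper's exact parametrization:

```latex
% Step 1: interpretable marginal transformations (MCTM-style),
% one monotone Bernstein polynomial per response dimension j:
\[
  \tilde{z}_j \;=\; h_j(y_j \mid x)
  \;=\; \sum_{m=0}^{M} \vartheta_{j,m}(x)\, \operatorname{Be}_m(y_j),
  \qquad j = 1, \dots, J,
\]
% where monotonicity in y_j is guaranteed by increasing coefficients
% \vartheta_{j,0}(x) \le \dots \le \vartheta_{j,M}(x).

% Step 2: a flexible autoregressive normalizing flow captures the
% remaining, possibly non-linear, dependence between the components:
\[
  z_j \;=\; f_j\!\left(\tilde{z}_j \,\middle|\, \tilde{z}_1, \dots, \tilde{z}_{j-1}\right),
  \qquad (z_1, \dots, z_J) \sim \mathcal{N}(0, I_J).
\]
```

Because step 1 acts elementwise on each response, the feature effects \(\vartheta_{j,m}(x)\) on the marginals stay interpretable, while step 2 contributes the expressive power of a neural autoregressive flow for the joint distribution.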
Problem

Research questions and friction points this paper is trying to address.

Combining interpretable marginal models with flexible normalizing flows
Addressing limitations of existing multivariate density regression methods
Enhancing transparency and flexibility in modeling complex distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines MCTM with autoregressive normalizing flows
Models interpretable marginal distributions first
Uses neural networks for complex joint distributions
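Both building blocks rest on monotone Bernstein-polynomial transformations. The following is a minimal sketch of such a transformation; the function name, the degree, and the cumulative-softplus parametrization are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np
from scipy.stats import binom

def bernstein_transform(y, theta_unconstrained):
    """Map y in (0, 1) through h(y) = sum_m theta_m * Be_m(y).

    Monotonicity is enforced by building strictly increasing
    coefficients theta_m from unconstrained parameters via a
    cumulative softplus (an illustrative choice).
    """
    M = len(theta_unconstrained) - 1
    # theta_0 is free; all subsequent increments are positive.
    increments = np.log1p(np.exp(theta_unconstrained[1:]))  # softplus > 0
    theta = np.concatenate([[theta_unconstrained[0]],
                            theta_unconstrained[0] + np.cumsum(increments)])
    # The Bernstein basis Be_m(y) equals the Binomial(M, y) pmf at m.
    m = np.arange(M + 1)
    basis = binom.pmf(m[None, :], M, y[:, None])  # shape (n, M+1)
    return basis @ theta

y = np.linspace(0.01, 0.99, 50)
h = bernstein_transform(y, np.zeros(8))
assert np.all(np.diff(h) > 0)  # the transformation is monotone increasing
```

In the hybrid model, the coefficients would be predicted from the features x (e.g. by a structured additive predictor for the marginals and a neural network inside the autoregressive flow), which is what separates the interpretable first step from the flexible second one.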
Marcel Arpogaus
HTWG Konstanz
Machine Learning · Smart Power Grid · Optimisation · Probabilistic Forecasting · Density Regression
T. Kneib
Chair of Statistics and Campus Institute Data Science (CIDAS), University of Göttingen, Göttingen, Germany
Thomas Nagler
LMU Munich, Munich Center for Machine Learning
mathematical statistics · statistical learning · computational statistics · copulas
David Rügamer
Department of Statistics, LMU Munich, Munich, Germany