Nonparametric estimation of a factorizable density using diffusion models

📅 2025-01-03
📈 Citations: 4
Influential: 1
🤖 AI Summary
To address the curse of dimensionality in high-dimensional nonparametric density estimation, this paper considers densities with a low-dimensional factorized structure, i.e., statistical independence across groups of variables. It proposes diffusion models as implicit density estimators and, for the first time within a statistical framework, shows that under the factorization assumption the resulting estimator attains a dimension-free, minimax-optimal convergence rate in total variation distance, thereby circumventing the curse of dimensionality. To encode this structural prior explicitly, the authors design a sparse weight-sharing neural network architecture that adaptively models the low-dimensional components. The theoretical analysis confirms the improved statistical efficiency, and empirical results show better estimation accuracy and interpretability in high-dimensional sparse settings, without sacrificing the flexibility of nonparametric methods.

📝 Abstract
In recent years, diffusion models, and more generally score-based deep generative models, have achieved remarkable success in various applications, including image and audio generation. In this paper, we view diffusion models as an implicit approach to nonparametric density estimation and study them within a statistical framework to analyze their surprising performance. A key challenge in high-dimensional statistical inference is leveraging low-dimensional structures inherent in the data to mitigate the curse of dimensionality. We assume that the underlying density exhibits a low-dimensional structure by factorizing into low-dimensional components, a property common in examples such as Bayesian networks and Markov random fields. Under suitable assumptions, we demonstrate that an implicit density estimator constructed from diffusion models adapts to the factorization structure and achieves the minimax optimal rate with respect to the total variation distance. In constructing the estimator, we design a sparse weight-sharing neural network architecture, where sparsity and weight-sharing are key features of practical architectures such as convolutional neural networks and recurrent neural networks.
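The factorization structure the abstract refers to can be written out explicitly; the block notation $B_j$ below is ours, not the paper's:

```latex
% Density factorizing over disjoint groups of variables B_1, ..., B_m:
p(x) \;=\; \prod_{j=1}^{m} p_j\!\left(x_{B_j}\right), \qquad x \in \mathbb{R}^d,
% so that the score function decomposes into low-dimensional terms:
\nabla_x \log p(x) \;=\; \sum_{j=1}^{m} \nabla_x \log p_j\!\left(x_{B_j}\right).
```

Under this assumption a score-based model only needs to learn each low-dimensional component score, so the effective dimension is $\max_j |B_j|$ rather than $d$, which is the intuition behind a dimension-free rate.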
Problem

Research questions and friction points this paper is trying to address.

Estimating high-dimensional densities implicitly with diffusion models
Leveraging factorizable low-dimensional structures to overcome the curse of dimensionality
Achieving minimax optimal rates with sparse weight-sharing neural architectures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Nonparametric density estimation via diffusion models
Sparse weight-sharing neural network architecture
Adapts to low-dimensional factorization structure
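To make the weight-sharing idea concrete, here is a minimal NumPy sketch of a score model for a block-factorized density. This is an illustration of the general idea, not the paper's architecture: the block size, the tiny MLP, and names like `block_score` are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: if p(x) factorizes over disjoint blocks B_1, ..., B_m, the score
# grad log p(x) is block-local. Weight sharing means a single small network
# is reused for every block (hypothetical sizes below).
BLOCK = 2    # assumed dimension of each low-dimensional component
HIDDEN = 16  # assumed hidden width of the shared network

W1 = rng.normal(scale=0.5, size=(BLOCK, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.5, size=(HIDDEN, BLOCK))
b2 = np.zeros(BLOCK)

def block_score(xb):
    """Shared small MLP mapping one block to its score estimate."""
    h = np.tanh(xb @ W1 + b1)
    return h @ W2 + b2

def score(x):
    """Factorized score: apply the shared block network to each block.

    Entries for block j depend only on x restricted to that block,
    mirroring the independence structure of the density.
    """
    d = x.shape[-1]
    assert d % BLOCK == 0
    return np.concatenate([block_score(x[..., i:i + BLOCK])
                           for i in range(0, d, BLOCK)], axis=-1)

x = rng.normal(size=8)  # an 8-dim point -> 4 blocks of size 2
s = score(x)
print(s.shape)          # (8,)
```

The weight sharing keeps the parameter count independent of the ambient dimension, and the block-wise application enforces sparsity: perturbing one block changes only that block's score entries.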