🤖 AI Summary
This work addresses the curse of dimensionality in high-dimensional multivariate density estimation. We propose the Variance-Reduced Sketching (VRS) framework, which models the density function as an infinite-dimensional tensor and integrates randomized sketching—inspired by numerical linear algebra—with variance reduction techniques, marking the first systematic incorporation of variance control principles into density estimation sketch design. Theoretically, VRS substantially alleviates dimension dependence, achieving a convergence rate that grows sublinearly with dimensionality. Empirically, on diverse synthetic and real-world datasets, VRS reduces estimation bias by 30–50% compared to state-of-the-art neural density estimators and kernel-based methods, thereby overcoming longstanding trade-offs between accuracy and scalability in nonparametric density estimation.
📝 Abstract
Multivariate density estimation is of great interest in various scientific and engineering disciplines. In this work, we introduce a new framework called Variance-Reduced Sketching (VRS), specifically designed to estimate multivariate density functions with a reduced curse of dimensionality. Our VRS framework conceptualizes multivariate functions as infinite-size matrices/tensors, and facilitates a new sketching technique motivated by the numerical linear algebra literature to reduce the variance in density estimation problems. We demonstrate the robust numerical performance of VRS through a series of simulated experiments and real-world data applications. Notably, VRS shows remarkable improvement over existing neural network density estimators and classical kernel methods in numerous distribution models. Additionally, we offer theoretical justifications for VRS to support its ability to deliver density estimation with a reduced curse of dimensionality.