Score-Debiased Kernel Density Estimation

📅 2025-04-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the leading-order bias of kernel density estimation (KDE) under Silverman's rule-of-thumb bandwidth. The proposed score-debiased KDE shifts each sample by a single gradient step along an estimated score function (the gradient of the log-density), then applies standard KDE with a correspondingly modified bandwidth; the step size and bandwidth are chosen jointly so that the leading-order term of the mean integrated squared error (MISE) is eliminated. Experiments on 1D and 2D synthetic data and on MNIST show consistent and significant MISE reductions over the Silverman baseline, and the gains persist even when the score estimate is noisy, supporting the method's practical reliability.

📝 Abstract
We propose a novel method for density estimation that leverages an estimated score function to debias kernel density estimation (SD-KDE). In our approach, each data point is adjusted by taking a single step along the score function with a specific choice of step size, followed by standard KDE with a modified bandwidth. The step size and modified bandwidth are chosen to remove the leading-order bias in the KDE. Our experiments on synthetic tasks in 1D and 2D and on MNIST demonstrate that the proposed SD-KDE method significantly reduces the mean integrated squared error compared to the standard Silverman KDE, even with noisy estimates of the score function. These results underscore the potential of integrating score-based corrections into nonparametric density estimation.
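The procedure in the abstract can be sketched in a few lines: shift each sample one step along an estimated score, then run an ordinary Gaussian KDE. The step size used below (proportional to h²) and the unchanged bandwidth are illustrative assumptions for this sketch, not the paper's derived optimal choices, and `score_fn` stands in for whatever score estimator is available.

```python
import numpy as np

def gaussian_kde(samples, query, h):
    """Standard 1D KDE with a Gaussian kernel and bandwidth h."""
    diffs = (query[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * diffs**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def sd_kde(samples, query, h, score_fn, step=None):
    """Sketch of score-debiased KDE: move each sample a single step
    along an estimated score, then apply standard KDE.
    The O(h^2) step size here is an illustrative placeholder."""
    if step is None:
        step = 0.5 * h**2  # assumption: step scales with h^2, as in bias-correction heuristics
    shifted = samples + step * score_fn(samples)
    return gaussian_kde(shifted, query, h)

# Toy usage: standard normal data, where the true score is s(x) = -x.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
q = np.linspace(-3.0, 3.0, 61)
est = sd_kde(x, q, h=0.4, score_fn=lambda t: -t)
```

In practice the score would come from a learned estimator (e.g. score matching) rather than the closed form used in this toy example.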
Problem

Research questions and friction points this paper is trying to address.

Debiasing kernel density estimation using score function
Reducing mean integrated squared error in density estimation
Improving accuracy with score-based corrections in KDE
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses score function to debias kernel density estimation
Adjusts data points with specific step size
Modifies bandwidth to reduce leading order bias
Elliot L. Epstein
PhD student, Stanford University
Deep learning, machine learning
Rajat Dwaraknath
Stanford University, Stanford, CA 94305, USA
Thanawat Sornwanee
Stanford University, Stanford, CA 94305, USA
John Winnicki
Stanford University, Stanford, CA 94305, USA
Jerry Weihong Liu
Stanford University, Stanford, CA 94305, USA