SVD Provably Denoises Nearest Neighbor Data

📅 2026-04-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the problem of nearest neighbor recovery under high-dimensional Gaussian noise when the data lie on an unknown low-dimensional subspace. The authors propose a denoising approach based on the singular value decomposition (SVD) and provide the first rigorous guarantee for SVD-based nearest neighbor recovery that does not require the noise level to decay polynomially with the ambient dimension. They prove that exact nearest neighbor recovery is achievable when the noise standard deviation satisfies σ = O(1/k^{1/4}), where k is the subspace dimension, and establish a matching information-theoretic lower bound showing that recovery is impossible in general when σ ≫ 1/k^{1/4}, so the threshold is tight. They further show that for σ ≫ 1/√k, naive nearest neighbor search on the noisy data fails even though SVD denoising still succeeds in part of that regime. Empirical evaluations on real-world datasets corroborate the theoretical findings.
📝 Abstract
We study the Nearest Neighbor Search (NNS) problem in a high-dimensional setting where data lies in a low-dimensional subspace and is corrupted by Gaussian noise. Specifically, we consider a semi-random model in which $n$ points from an unknown $k$-dimensional subspace of $\mathbb{R}^d$ ($k \ll d$) are perturbed by zero-mean $d$-dimensional Gaussian noise with variance $σ^2$ per coordinate. Assuming the second-nearest neighbor is at least a factor $(1+\varepsilon)$ farther from the query than the nearest neighbor, and given only the noisy data, our goal is to recover the nearest neighbor in the uncorrupted data. We prove three results. First, for $σ\in O(1/k^{1/4})$, simply performing SVD denoises the data and provably recovers the correct nearest neighbor of the uncorrupted data. Second, for $σ\gg 1/k^{1/4}$, the nearest neighbor in the uncorrupted data is not even identifiable from the noisy data in general, giving a matching lower bound and showing the necessity of this threshold for NNS. Third, for $σ\gg 1/\sqrt{k}$, the noise magnitude $σ\sqrt d$ significantly exceeds inter-point distances in the unperturbed data, and the nearest neighbor in the noisy data generally differs from that in the uncorrupted data. Thus, the first and third results together imply that SVD can identify the correct nearest neighbor even in regimes where naive nearest neighbor search on the noisy data fails. Compared to \citep{abdullah2014spectral}, our result does not require $σ$ to be at least an inverse polynomial in the ambient dimension $d$. Our analysis uses perturbation bounds for singular spaces together with Gaussian concentration and spherical symmetry. We also provide empirical results on real datasets supporting our theory.
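The abstract's pipeline (sample points from a k-dimensional subspace, add per-coordinate Gaussian noise, denoise by rank-k SVD truncation, then run nearest neighbor search) can be sketched in a small simulation. This is an illustrative reconstruction, not the authors' code; the dimensions `n, d, k`, the constant in `sigma`, and the helper `nn` are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 500, 20       # points, ambient dimension, subspace dimension (illustrative)
sigma = 0.3 / k**0.25        # noise level inside the provable regime sigma = O(1/k^{1/4})

# Clean points on a random k-dimensional subspace of R^d
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal d x k basis
X_clean = rng.standard_normal((n, k)) @ basis.T        # n x d clean data

# Corrupt every coordinate with zero-mean Gaussian noise of variance sigma^2
X_noisy = X_clean + sigma * rng.standard_normal((n, d))

# SVD denoising: project the noisy points onto their top-k right singular subspace
U, S, Vt = np.linalg.svd(X_noisy, full_matrices=False)
X_denoised = X_noisy @ Vt[:k].T @ Vt[:k]               # rank-k projection

def nn(X, q):
    """Index of the nearest neighbor of row q in X (excluding q itself)."""
    dist = np.linalg.norm(X - X[q], axis=1)
    dist[q] = np.inf
    return int(np.argmin(dist))

# Fraction of queries whose clean-data nearest neighbor is recovered
queries = range(50)
acc_noisy = np.mean([nn(X_noisy, q) == nn(X_clean, q) for q in queries])
acc_denoised = np.mean([nn(X_denoised, q) == nn(X_clean, q) for q in queries])
print(f"NN recovery on raw noisy data: {acc_noisy:.2f}")
print(f"NN recovery after SVD:         {acc_denoised:.2f}")
```

Raising `sigma` well above `1/k**0.25` in this sketch lets one probe the impossibility regime the lower bound describes, while values between `1/np.sqrt(k)` and `1/k**0.25` illustrate where naive search on the noisy data degrades but SVD denoising can still help.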
Problem

Research questions and friction points this paper is trying to address.

Nearest Neighbor Search
Denoising
High-dimensional Data
Gaussian Noise
Low-dimensional Subspace
Innovation

Methods, ideas, or system contributions that make the work stand out.

SVD Denoising
Nearest Neighbor Search
High-dimensional Statistics
Subspace Recovery
Spectral Perturbation
🔎 Similar Papers
2024-08-30 · Journal of Information Security and Applications · Citations: 0