AI Summary
To address the instability of tangent space estimation on data manifolds under high noise and the sensitivity of conventional Local Principal Component Analysis (LPCA) to the choice of neighborhood scale, this paper proposes a robust tangent space estimation method based on orthogonalizing the gradients of low-frequency eigenvectors of the graph Laplacian. By leveraging global spectral structure to guide local geometric inference, the method theoretically guarantees that the subspace spanned by the gradients of low-frequency eigenfunctions converges uniformly to the true tangent bundle, even under Gaussian noise and outlier corruption. Technically, it integrates graph spectral theory, differential geometry, and random matrix analysis. Experiments demonstrate significant improvements over LPCA across downstream tasks including manifold learning, boundary detection, and local intrinsic dimension estimation, achieving both strong theoretical guarantees and superior robustness.
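For context, a minimal sketch of the LPCA baseline that the summary contrasts with: at each point, collect the k nearest neighbors, center them, and take the top-d principal directions as the tangent basis. The function name, the brute-force distance computation, and the specific neighborhood rule are illustrative assumptions, not the paper's implementation; the neighborhood size k is exactly the scale parameter whose choice LPCA is sensitive to.

```python
import numpy as np

def lpca_tangent_spaces(X, d, k=10):
    """Estimate a d-dim tangent basis at each row of X (n points in R^D)
    by PCA on the k-nearest-neighbor neighborhood of each point."""
    n, D = X.shape
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    bases = np.empty((n, D, d))
    for i in range(n):
        nbrs = np.argsort(D2[i])[:k + 1]        # k neighbors plus the point itself
        A = X[nbrs] - X[nbrs].mean(0)           # center the neighborhood
        _, _, Vt = np.linalg.svd(A, full_matrices=False)
        bases[i] = Vt[:d].T                     # top-d principal directions
    return bases
```

On clean data with a well-chosen k this works well; the trade-off described above appears when noise forces k to be large enough to average out perturbations yet small enough that curvature does not bend the neighborhood away from the tangent plane.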
Abstract
Estimating the tangent spaces of a data manifold is a fundamental problem in data analysis. The standard approach, Local Principal Component Analysis (LPCA), struggles in high-noise settings due to a critical trade-off in choosing the neighborhood size: selecting an optimal size requires prior knowledge of the geometric and noise characteristics of the data, which is often unavailable. In this paper, we propose a spectral method, Laplacian Eigenvector Gradient Orthogonalization (LEGO), that utilizes the global structure of the data to guide local tangent space estimation. Instead of relying solely on local neighborhoods, LEGO estimates the tangent space at each data point by orthogonalizing the gradients of low-frequency eigenvectors of the graph Laplacian. We provide two theoretical justifications for our method. First, a differential geometric analysis on a tubular neighborhood of a manifold shows that the gradients of the tube's low-frequency Laplacian eigenfunctions align closely with the manifold's tangent bundle, while eigenfunctions with high gradients in directions orthogonal to the manifold lie deeper in the spectrum. Second, a random matrix theoretic analysis demonstrates that low-frequency eigenvectors are robust to sub-Gaussian noise. Through comprehensive experiments, we demonstrate that LEGO yields tangent space estimates that are significantly more robust to noise than those from LPCA, resulting in marked improvements in downstream tasks such as manifold learning, boundary detection, and local intrinsic dimension estimation.
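The pipeline the abstract describes can be sketched as follows, assuming a point cloud and a known intrinsic dimension d. The graph construction (k-NN with Gaussian weights), the number m of retained eigenvectors, and the weighted-difference gradient estimator are all illustrative choices, not necessarily those used in the paper; only the overall idea of orthogonalizing low-frequency eigenvector gradients comes from the text above.

```python
import numpy as np

def graph_laplacian(X, k=10, sigma=1.0):
    """Unnormalized Laplacian of a symmetrized k-NN graph with Gaussian weights."""
    n = X.shape[0]
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D2[i])[1:k + 1]       # k nearest neighbors, self excluded
        W[i, nbrs] = np.exp(-D2[i, nbrs] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                      # symmetrize
    return np.diag(W.sum(1)) - W, W

def lego_tangent_spaces(X, d, m=12, k=10, sigma=1.0):
    """Tangent basis at each point: orthogonalize local gradient estimates of
    the m lowest-frequency Laplacian eigenvectors (constant mode skipped)."""
    n, D = X.shape
    L, W = graph_laplacian(X, k, sigma)
    _, V = np.linalg.eigh(L)                    # eigenvalues in ascending order
    V = V[:, 1:m + 1]                           # m low-frequency eigenvectors
    bases = np.empty((n, D, d))
    for i in range(n):
        nbrs = np.nonzero(W[i])[0]
        A = X[nbrs] - X[i]                      # neighbor offsets, k' x D
        B = V[nbrs] - V[i]                      # eigenvector differences, k' x m
        # Weighted-difference gradient estimate of each eigenvector at point i.
        G = A.T @ (W[i, nbrs, None] * B)        # D x m matrix of gradient columns
        # Orthogonalize: top-d left singular vectors span the tangent estimate.
        U, _, _ = np.linalg.svd(G, full_matrices=False)
        bases[i] = U[:, :d]
    return bases
```

The key structural difference from LPCA is that the eigenvectors encode the global shape of the data, so the local step only has to orthogonalize a small gradient matrix rather than discriminate signal from noise directions on its own.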