Beyond the noise: intrinsic dimension estimation with optimal neighbourhood identification

📅 2024-05-24
🏛️ arXiv.org
📈 Citations: 3
Influential: 0
📄 PDF
🤖 AI Summary
Estimating intrinsic dimensionality (ID) from real-world data is highly sensitive to neighborhood scale: small scales overestimate ID due to noise, while large scales introduce bias from manifold curvature and topology. This work proposes a self-consistent scale selection protocol that identifies the optimal “sweet spot” for ID estimation by enforcing local density constancy. Our key contribution is the first formal coupling of ID estimation and scale selection, resolved via iterative optimization that yields a theoretically guaranteed robust decoupling—effectively suppressing both noise and curvature effects. The method integrates local neighborhood graph construction, asymptotic statistical analysis, and rigorous error-bound derivation. Evaluated on diverse synthetic and real-world datasets, it reduces ID estimation error by over 30% compared to state-of-the-art methods, while significantly improving stability and noise robustness.

Technology Category

Application Category

📝 Abstract
The Intrinsic Dimension (ID) is a key concept in unsupervised learning and feature selection, as it is a lower bound to the number of variables which are necessary to describe a system. However, in almost any real-world dataset the ID depends on the scale at which the data are analysed. Quite typically at a small scale, the ID is very large, as the data are affected by measurement errors. At large scale, the ID can also be erroneously large, due to the curvature and the topology of the manifold containing the data. In this work, we introduce an automatic protocol to select the sweet spot, namely the correct range of scales in which the ID is meaningful and useful. This protocol is based on imposing that for distances smaller than the correct scale the density of the data is constant. In the presented framework, to estimate the density it is necessary to know the ID, therefore, this condition is imposed self-consistently. We derive theoretical guarantees and illustrate the usefulness and robustness of this procedure by benchmarks on artificial and real-world datasets.
Problem

Research questions and friction points this paper is trying to address.

Estimating intrinsic dimension by identifying optimal neighborhood scales
Addressing scale-dependent ID variations due to noise and curvature
Developing automatic protocol for meaningful scale selection in datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Automatic protocol identifies optimal scale range
Self-consistent density condition determines intrinsic dimension
Framework validated on artificial and real datasets
🔎 Similar Papers
No similar papers found.