🤖 AI Summary
High-dimensional vector approximate nearest neighbor search (ANNS) suffers from an efficiency bottleneck because distance-computation cost grows linearly with dimensionality, a problem that is especially acute for LLM-derived semantic vectors. This work systematically evaluates six dimensionality reduction (DR) techniques (PCA, product quantization, autoencoders, contrastive learning-based DR, LSH variants, and random projection), quantifying their acceleration effects on mainstream ANNS engines (e.g., FAISS, Annoy) under a unified experimental framework. It proposes two analyzable DR–search co-design architectures, theoretically derives the critical pruning-gain thresholds, and characterizes the fundamental trade-off between dimensionality compression and retrieval-accuracy degradation. Experiments on six public benchmarks show that deep DR methods achieve a 1.8–3.5× speedup while maintaining recall above 90%. The work also provides a data-aware guideline for selecting the optimal DR technique based on intrinsic data properties.
📝 Abstract
Approximate Nearest Neighbor Search (ANNS) on high-dimensional vectors has become a fundamental component of many machine learning tasks. Recently, with the rapid development of deep learning models and the applications of Large Language Models (LLMs), the dimensionality of these vectors keeps growing to accommodate richer semantic representations. This poses a major challenge to ANNS solutions, since the distance-calculation cost in ANNS grows linearly with vector dimensionality. To overcome this challenge, dimensionality-reduction techniques can be leveraged to accelerate distance calculation during the search process. In this paper, we investigate six dimensionality-reduction techniques with the potential to improve ANNS solutions, including classical algorithms such as PCA and vector quantization as well as algorithms based on deep learning. We further describe two frameworks for applying these techniques in the ANNS workflow, and theoretically analyze their time and space costs as well as the beneficial threshold for the pruning ratio. The surveyed techniques are evaluated on six public datasets. Analysis of the results reveals the characteristics of the different families of techniques and suggests promising directions for future research.
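To make the filter-then-refine idea concrete, below is a minimal sketch (not the paper's implementation) of one common way to combine a DR technique with search: project vectors to a low-dimensional space with PCA, filter a candidate pool there with cheap distance computations, then re-rank the pool with exact distances in the original space. All names, dimensions, and the pool size are illustrative assumptions, and exact brute force stands in for a real ANNS engine.

```python
import numpy as np

rng = np.random.default_rng(0)
d, d_reduced, n, k = 128, 16, 10_000, 10   # illustrative sizes, not from the paper

base = rng.standard_normal((n, d)).astype(np.float32)
query = base[42].copy()                    # plant the query so its true NN is index 42

# Fit PCA on the base vectors: center, then keep the top d_reduced principal axes.
mean = base.mean(axis=0)
_, _, vt = np.linalg.svd(base - mean, full_matrices=False)
proj = vt[:d_reduced].T                    # (d, d_reduced) projection matrix

base_lo = (base - mean) @ proj             # reduced-dimension copies of the base set
query_lo = (query - mean) @ proj

# Stage 1 (filter): cheap squared distances in the reduced space keep a candidate pool.
pool = 100                                 # pool size; pruning ratio here is pool / n
dist_lo = np.sum((base_lo - query_lo) ** 2, axis=1)
candidates = np.argpartition(dist_lo, pool)[:pool]

# Stage 2 (refine): exact re-ranking of the pool in the original d-dimensional space.
dist_hi = np.sum((base[candidates] - query) ** 2, axis=1)
topk = candidates[np.argsort(dist_hi)[:k]]  # topk[0] == 42: the planted NN survives both stages
```

The pruning-ratio threshold the abstract mentions can be read off this sketch: the two-stage pipeline computes roughly `n * d_reduced + pool * d` distance terms versus `n * d` for full-dimensional search, so it only pays off when the pool is small enough (and the DR cheap enough) to keep that sum below the brute-force cost without losing true neighbors in stage 1.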