🤖 AI Summary
This work addresses a central challenge in k-nearest neighbors (k-NN) classification: inference efficiency and accuracy are difficult to balance at scale, and a fixed k lacks adaptability across the data. To overcome these limitations, the authors propose a training-phase-optimized adaptive graph model that integrates Hierarchical Navigable Small World (HNSW) graphs with a precomputed weighted voting mechanism. During training, the method constructs a hierarchical graph structure and determines, for each node, an optimal neighbor set along with corresponding weights, enabling real-time classification at inference without dynamic neighbor search. Experiments on six benchmark datasets demonstrate that the proposed approach significantly outperforms eight state-of-the-art baselines, achieving substantial inference speedups while maintaining or even improving classification accuracy—effectively reconciling high precision with real-time performance.
📝 Abstract
The k-nearest neighbors (kNN) algorithm is a cornerstone of non-parametric classification in artificial intelligence, yet its deployment in large-scale applications is persistently constrained by the computational trade-off between inference speed and accuracy. Existing approximate nearest neighbor solutions accelerate retrieval but often degrade classification precision and lack adaptability in selecting the optimal neighborhood size (k). Here, we present an adaptive graph model that decouples inference latency from computational complexity. By integrating a Hierarchical Navigable Small World (HNSW) graph with a pre-computed voting mechanism, our framework transfers the entire computational burden of neighbor selection and weighting to the training phase. Within this topological structure, higher graph layers enable rapid navigation, while lower layers encode precise, node-specific decision boundaries with adaptive neighbor counts. Benchmarking against eight state-of-the-art baselines across six diverse datasets, we demonstrate that this architecture significantly accelerates inference, achieving real-time performance without compromising classification accuracy. These findings offer a scalable, robust solution to the long-standing inference bottleneck of kNN, establishing a new structural paradigm for graph-based non-parametric learning.
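The core idea described in the abstract—pushing neighbor selection and weighting entirely into training, so that inference reduces to graph navigation plus a stored vote—can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: brute-force nearest-node search stands in for HNSW layer descent, the "smallest k that reproduces the node's own label" rule for choosing each node's adaptive neighborhood is a hypothetical stand-in for the paper's criterion, and the class name `PrecomputedVoteKNN` and its parameters are invented for this example.

```python
import numpy as np

class PrecomputedVoteKNN:
    """Toy sketch: per-node adaptive neighbor sets and inverse-distance
    vote weights are computed at training time; inference is a single
    nearest-node lookup (a stand-in for HNSW navigation) followed by
    reading out that node's precomputed vote -- no dynamic k-NN search."""

    def __init__(self, k_max=15):
        self.k_max = k_max  # upper bound on the adaptive neighbor count

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        n = len(self.X)
        self.votes = np.empty(n, dtype=self.y.dtype)
        for i in range(n):
            d = np.linalg.norm(self.X - self.X[i], axis=1)
            order = np.argsort(d)[1:self.k_max + 1]  # exclude the node itself
            vote = self.y[i]
            # Assumed adaptive-k heuristic: smallest k whose weighted vote
            # agrees with the node's own label (falls back to k_max's vote).
            for k in range(1, len(order) + 1):
                nbrs = order[:k]
                w = 1.0 / (d[nbrs] + 1e-9)  # inverse-distance weights
                labels, idx = np.unique(self.y[nbrs], return_inverse=True)
                vote = labels[np.bincount(idx, weights=w).argmax()]
                if vote == self.y[i]:
                    break
            self.votes[i] = vote  # stored vote: nothing to search at inference
        return self

    def predict(self, Xq):
        Xq = np.asarray(Xq, dtype=float)
        out = []
        for q in Xq:
            # Stand-in for HNSW descent: locate the nearest training node,
            # then return its precomputed vote directly.
            i = int(np.argmin(np.linalg.norm(self.X - q, axis=1)))
            out.append(self.votes[i])
        return np.array(out)
```

The design point the sketch illustrates is the decoupling claimed in the abstract: `fit` pays the full cost of neighbor selection and weighting once, so `predict` does no neighborhood search or weight computation at query time.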