🤖 AI Summary
To address the sensitivity of high-dimensional, low-sample-size (HDLSS) classification to noise, outliers, and distribution shift, this paper proposes a parameter-free, distribution-agnostic robust classifier. The method couples energy distance with local geometric structure for adaptive distance scaling, and integrates data-driven kernel bandwidth selection, weighted energy-distance computation, sparsity-aware subspace projection, and a distance-based nearest-neighbor decision rule. Crucially, it imposes no distributional assumptions (neither Gaussianity nor moment conditions) and markedly improves discriminative power in small-sample and non-spherical-cluster settings. Across 12 HDLSS benchmark datasets it improves average accuracy by 3.2% over state-of-the-art methods, maintains ≥89% accuracy under 50% label noise, and runs inference two orders of magnitude faster than deep-learning baselines.
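To make the core idea concrete, here is a minimal sketch of a plain (unweighted) energy-distance nearest-class rule, the building block the summary describes. This is not the paper's method: the geometry-adaptive scaling, weighting, bandwidth selection, and sparse projection are all omitted, and every function name here is hypothetical.

```python
import numpy as np

def energy_distance(X, Y):
    """Energy distance between samples X (n,d) and Y (m,d):
    2*E||X-Y|| - E||X-X'|| - E||Y-Y'||.
    Nonnegative; zero iff the underlying distributions match.
    Requires only finite first moments -- no Gaussianity assumption."""
    d_xy = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1).mean()
    d_xx = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1).mean()
    d_yy = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1).mean()
    return 2.0 * d_xy - d_xx - d_yy

def classify(x, class_samples):
    """Assign point x (d,) to the class whose training sample has the
    smallest energy distance to the singleton {x}."""
    scores = {label: energy_distance(x[None, :], S)
              for label, S in class_samples.items()}
    return min(scores, key=scores.get)
```

Note that the within-class term `d_yy` penalizes diffuse classes, which is what distinguishes this rule from a bare mean-distance classifier and hints at why energy-based criteria stay informative when clusters are non-spherical.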