🤖 AI Summary
We address binary classification under high-dimensional sparse mean differences and spiked covariance structures—where the covariance matrix exhibits a few large eigenvalues and many small, densely packed ones. We propose an adaptive framework comprising whitening via the sample covariance's inverse square root, followed by sparse feature screening based on the top-$s$ coordinates of the whitened mean-difference vector, and finally Fisher's linear discriminant in the reduced space. Our theoretical contribution is the first incorporation of entrywise matrix perturbation bounds for spiked covariance models into classification analysis, enabling attainment of the Bayes-optimal rate asymptotically (as $n \to \infty$ with $s\sqrt{(\log p)/n} \to 0$) without requiring prior knowledge of the sparsity level $s$. Empirically, our method achieves accuracy competitive with state-of-the-art approaches while substantially improving feature-selection compactness.
📝 Abstract
We study the classification problem for high-dimensional data with $n$ observations on $p$ features, where the $p \times p$ covariance matrix $\Sigma$ exhibits a spiked eigenvalue structure and the vector $\zeta$, given by the difference between the whitened mean vectors, is sparse with sparsity at most $s$. We propose an adaptive classifier (adaptive with respect to the sparsity $s$) that performs dimension reduction on the feature vectors prior to classification in the reduced space: the classifier whitens the data, then screens the features by keeping only those corresponding to the $s$ largest coordinates of $\zeta$, and finally applies Fisher's linear discriminant on the selected features. Leveraging recent results on entrywise matrix perturbation bounds for covariance matrices, we show that the resulting classifier is Bayes optimal whenever $n \rightarrow \infty$ and $s \sqrt{n^{-1} \ln p} \rightarrow 0$. Experimental results on real and synthetic data sets indicate that the proposed classifier is competitive with existing state-of-the-art methods while also selecting a smaller number of features.
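The whiten → screen → classify pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact estimator: the function name is hypothetical, the sparsity level $s$ is passed in rather than chosen adaptively, and the eigenvalue clipping used to stabilize the inverse square root is an assumption for numerical safety.

```python
import numpy as np

def whiten_screen_fisher(X0, X1, s):
    """Sketch of the whiten -> screen -> Fisher LDA pipeline.
    X0, X1: (n0, p) and (n1, p) arrays of samples from the two classes.
    s: number of features kept after screening (here given, not adaptive).
    Returns a classifier mapping a p-vector to 0 (class of X0) or 1 (class of X1)."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled sample covariance from centered observations
    Xc = np.vstack([X0 - mu0, X1 - mu1])
    Sigma_hat = Xc.T @ Xc / (len(Xc) - 2)
    # Whitening matrix Sigma_hat^{-1/2} via eigendecomposition
    # (eigenvalue floor is an illustrative regularization choice)
    w, V = np.linalg.eigh(Sigma_hat)
    W = V @ np.diag(np.clip(w, 1e-8, None) ** -0.5) @ V.T
    # Whitened mean-difference vector zeta_hat; screen its s largest coordinates
    zeta_hat = W @ (mu1 - mu0)
    keep = np.argsort(-np.abs(zeta_hat))[:s]
    midpoint = (mu0 + mu1) / 2.0

    def classify(x):
        # Fisher's rule in the whitened, screened coordinates:
        # project the centered point onto the screened direction
        z = (W @ (x - midpoint))[keep]
        return int(z @ zeta_hat[keep] > 0)

    return classify
```

With whitened data the pooled covariance is (approximately) the identity, so Fisher's discriminant reduces to the sign of the inner product with the screened mean-difference direction, which is what `classify` computes.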