Estimating Graph Dimension with Cross-validated Eigenvalues

📅 2021-08-06
🏛️ arXiv.org
📈 Citations: 15
Influential: 2
🤖 AI Summary
Estimating the latent dimensionality (effective rank) $k$ of graph data is a fundamental challenge in multivariate statistics and network analysis. Existing heuristics, such as the "elbow method", can fail under nonparametric random graph models (e.g., with Poisson or Bernoulli edges) because of systematic bias in the sample eigenvalues. This paper introduces a model-agnostic cross-validation framework for choosing $k$: for each sample eigenvector, it uses held-out data to test the null hypothesis that the eigenvector is orthogonal to (i.e., uncorrelated with) the true latent dimensions, yielding calibrated $p$-values that adaptively identify the detectable dimensions. The authors establish theoretical guarantees: when all $k$ dimensions are detectable, the estimator consistently recovers the true $k$, overcoming the limitations of ad hoc criteria. Simulations and real-world network analyses show that the method compares favorably to classical approaches in both statistical accuracy and computational efficiency.
📝 Abstract
In applied multivariate statistics, estimating the number of latent dimensions or the number of clusters is a fundamental and recurring problem. One common diagnostic is the scree plot, which shows the largest eigenvalues of the data matrix in decreasing order; the user searches for a “gap” or “elbow” in the decaying eigenvalues; unfortunately, these patterns can hide beneath the bias of the sample eigenvalues. This methodological problem is conceptually difficult because, in many situations, there is only enough signal to detect a subset of the k population dimensions/eigenvectors. In this situation, one could argue that the correct choice of k is the number of detectable dimensions. We alleviate these problems with cross-validated eigenvalues. Under a large class of random graph models, without any parametric assumptions, we provide a p-value for each sample eigenvector. It tests the null hypothesis that this sample eigenvector is orthogonal to (i.e., uncorrelated with) the true latent dimensions. This approach naturally adapts to problems where some dimensions are not statistically detectable. In scenarios where all k dimensions can be estimated, we prove that our procedure consistently estimates k. In simulations and a data example, the proposed estimator compares favorably to alternative approaches in both computational and statistical performance.
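The cross-validated eigenvalue idea described in the abstract can be sketched in a few lines: split the graph's edges into a training half and a test half, compute the leading eigenvectors of the training half, and test each eigenvector for correlation with the held-out half. The sketch below is an illustrative approximation under stated assumptions, not the authors' reference implementation; the function name `cv_eigenvalue_pvalues`, the binomial edge split, and the plug-in normal variance estimate are assumptions made for the sake of a runnable example.

```python
import numpy as np
from scipy.stats import norm

def cv_eigenvalue_pvalues(A, k_max, split_prob=0.5, seed=0):
    """Hypothetical sketch of cross-validated eigenvalue testing.

    Splits an integer-weighted symmetric adjacency matrix A into
    train/test halves by binomial thinning of each edge, computes the
    top k_max eigenvectors of the training half, and for each one
    tests (via a normal approximation) the null hypothesis that it is
    uncorrelated with the signal in the held-out half.
    """
    rng = np.random.default_rng(seed)
    # Thin each edge once (upper triangle), then re-symmetrize, so the
    # two halves are themselves valid symmetric adjacency matrices.
    upper = np.triu(A.astype(int), 1)
    train_u = rng.binomial(upper, split_prob)
    test_u = upper - train_u
    train = train_u + train_u.T
    test = test_u + test_u.T
    # Leading eigenvectors (by absolute eigenvalue) of the train half.
    vals, vecs = np.linalg.eigh(train)
    order = np.argsort(-np.abs(vals))[:k_max]
    pvals = []
    for i in order:
        x = vecs[:, i]
        stat = x @ test @ x  # "cross-validated eigenvalue"
        # Plug-in null variance: for independent Poisson/Bernoulli-like
        # edges, Var(stat) is roughly 2 * sum_{u,v} x_u^2 x_v^2 test_uv.
        var = 2.0 * np.sum((np.outer(x, x) ** 2) * test)
        z = stat / np.sqrt(max(var, 1e-12))
        pvals.append(1.0 - norm.cdf(z))  # one-sided p-value
    return np.array(pvals)
```

For a graph with clear community structure, the p-values for the first few eigenvectors should be near zero, while those of pure-noise eigenvectors should look uniform; choosing the largest run of small p-values then gives an estimate of $k$.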
Problem

Research questions and friction points this paper is trying to address.

Estimating latent dimensions in random graph models
Testing sample eigenvectors' correlation with true dimensions
Consistently determining the number of clusters k
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cross-validated eigenvalues estimate graph dimensions
p-values test eigenvector orthogonality to latent dimensions
Procedure adapts to statistically undetectable dimensions and consistently estimates k when all dimensions are detectable
Fan Chen
Department of Statistics, University of Wisconsin–Madison
S. Roch
Department of Mathematics, University of Wisconsin–Madison
Karl Rohe
Professor of Statistics, UW Madison
Shuqi Yu
Department of Mathematics, University of Wisconsin–Madison