Anisotropic local law for non-separable sample covariance matrices

📅 2026-02-20
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the limitations of existing local law analyses for sample covariance matrices, which are typically confined to separable models and thus ill-suited for capturing the dependence and nonlinear structures prevalent in real-world data. By abandoning the separability assumption, the authors introduce an interpolation-based higher-order cumulant tensor condition and develop a tensor network framework to characterize non-separable correlations. Combining quadratic form concentration, Stieltjes transform techniques, and fluctuation averaging, they establish optimal averaged and anisotropic local laws for non-separable sample covariance matrices down to the spectral scale η ≥ N⁻¹⁺ε, thereby achieving entrywise control of resolvents in arbitrary directions. The theoretical results are validated in representative non-separable settings, including zero-mean distributions with general dependencies, random feature models, and nonlinearly tilted Gaussian measures.
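Schematically, the two local laws described in the summary take the standard form below. This is a generic sketch in the notation common to the local-law literature; the precise error terms and constants are not quoted from this paper:

```latex
% Resolvent and Stieltjes transform of the sample covariance matrix K
G(z) = (K - zI)^{-1}, \qquad
m_N(z) = \frac{1}{n}\operatorname{Tr} G(z), \qquad
z = E + i\eta .

% Averaged local law: m_N tracks its deterministic limit m
% uniformly down to the optimal spectral scale eta >= N^{-1+eps}.
\left| m_N(z) - m(z) \right| \prec \frac{1}{N\eta} .

% Anisotropic local law: control of the resolvent in arbitrary
% deterministic directions u, v, with deterministic equivalent M(z).
\left| \langle u, \big(G(z) - M(z)\big) v \rangle \right|
  \prec \sqrt{\frac{\operatorname{Im} m(z)}{N\eta}} + \frac{1}{N\eta} .
```

Here ≺ denotes stochastic domination; the averaged law controls the trace of the resolvent, while the anisotropic law upgrades this to all fixed directions u, v.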

๐Ÿ“ Abstract
We establish local laws for sample covariance matrices $K = N^{-1}\sum_{i=1}^N g_i g_i^*$ where the random vectors $g_1, \ldots, g_N \in \mathbb{R}^n$ are independent with common covariance $\Sigma$. Previous work has largely focused on the separable model $g = \Sigma^{1/2} w$ with $w$ having independent entries, but this structure is rarely present in statistical applications involving dependent or nonlinearly transformed data. Under a concentration assumption for quadratic forms $g^* A g$, we prove an optimal averaged local law showing that the Stieltjes transform of $K$ converges to its deterministic limit uniformly down to the optimal scale $\eta \geq N^{-1+\varepsilon}$. Under an additional structural assumption on the cumulant tensors of $g$, which interpolates between the highly structured case of independent entries and generic dependence, we establish the full anisotropic local law, providing entrywise control of the resolvent $(K - zI)^{-1}$ in arbitrary directions. We discuss several classes of non-separable examples satisfying our assumptions, including conditionally mean-zero distributions, the random features model $g = \sigma(Xw)$ arising in machine learning, and Gaussian measures with nonlinear tilting. The proofs introduce a tensor network framework for analyzing fluctuation averaging in the presence of higher-order cumulant structure.
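As a minimal numerical sketch (not the paper's proof machinery), one can build the non-separable random features model $g_i = \sigma(X w_i)$ from the abstract, form the sample covariance $K$, and evaluate its Stieltjes transform at a spectral parameter whose imaginary part sits near the optimal scale $\eta = N^{-1+\varepsilon}$. The choices of dimensions, $\sigma = \tanh$, and $\varepsilon = 0.1$ below are arbitrary illustration parameters, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 200, 400            # feature dimension n and sample count N
d = 300                    # input dimension of the random features map

# Fixed feature matrix X; samples g_i = tanh(X w_i) with i.i.d. Gaussian w_i.
# These vectors are non-separable: no Sigma^{1/2} w representation with
# independent entries of w is available after the nonlinearity.
X = rng.standard_normal((n, d)) / np.sqrt(d)
W = rng.standard_normal((d, N))
G_samples = np.tanh(X @ W)                     # n x N matrix with columns g_i

# Sample covariance K = N^{-1} sum_i g_i g_i^*.
K = (G_samples @ G_samples.T) / N

eps = 0.1
eta = N ** (-1 + eps)                          # spectral scale eta = N^{-1+eps}
z = 1.0 + 1j * eta                             # spectral parameter in the bulk

# Stieltjes transform m_N(z) = n^{-1} Tr (K - zI)^{-1} via the eigenvalues of K.
evals = np.linalg.eigvalsh(K)
m_N = np.mean(1.0 / (evals - z))
print(m_N)
```

Since Im z > 0, the computed m_N(z) has positive imaginary part; the averaged local law asserts that, at this scale, m_N(z) concentrates around its deterministic limit up to an error of order (Nη)⁻¹.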
Problem

Research questions and friction points this paper is trying to address.

anisotropic local law
non-separable sample covariance matrices
dependent data
nonlinear transformations
random features model
Innovation

Methods, ideas, or system contributions that make the work stand out.

anisotropic local law
non-separable covariance
tensor network
cumulant structure
resolvent control