Asymptotic Theory of Eigenvectors for Latent Embeddings with Generalized Laplacian Matrices

📅 2025-03-01
🤖 AI Summary
This paper addresses the lack of asymptotic theory for the latent-embedding eigenvectors of generalized Laplacian matrices, particularly when nodes exhibit dependence, which renders classical random matrix theory (RMT) inapplicable. We propose ATE-GL, the first high-order asymptotic theory for spiked eigenvectors and eigenvalues under dependence. Methodologically, we establish higher-order asymptotic expansions and asymptotic normality for the spiked eigenvectors of generalized regularized Laplacian matrices, combining generalized quadratic vector equations, local laws, and dependence-aware RMT tools. Our theory relaxes the stringent independence assumption of classical RMT, enabling precise quantification of embedding uncertainty. Numerical experiments demonstrate the accuracy and robustness of ATE-GL in graph embedding and manifold learning tasks.

📝 Abstract
Laplacian matrices are commonly employed in many real applications, encoding underlying latent structural information such as graphs and manifolds. The normalization terms naturally give rise to random matrices with dependency. It is well known that dependency is a major bottleneck for new random matrix theory (RMT) developments. To this end, in this paper, we formally introduce a class of generalized (and regularized) Laplacian matrices, which contains the Laplacian matrix and the random adjacency matrix as special cases, and suggest a new framework for the asymptotic theory of eigenvectors for latent embeddings with generalized Laplacian matrices (ATE-GL). Our new theory is empowered by the tool of the generalized quadratic vector equation for dealing with RMT under dependency, and by delicate high-order asymptotic expansions of the empirical spiked eigenvectors and eigenvalues based on local laws. The asymptotic normality established for both spiked eigenvectors and eigenvalues enables precise inference and flexible uncertainty quantification for applications involving generalized Laplacian matrices. We discuss some applications of the suggested ATE-GL framework and showcase its validity through numerical examples.
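To make the objects in the abstract concrete, here is a minimal sketch (not the paper's implementation) of a regularized normalized Laplacian embedding: an adjacency matrix is drawn from a two-block stochastic block model, the regularized Laplacian D_tau^{-1/2} A D_tau^{-1/2} is formed, and its spiked (largest-magnitude) eigenvectors are used as the latent embedding. The SBM parameters and the regularizer `tau` are illustrative choices, not values from the paper.

```python
import numpy as np

# Illustrative two-block stochastic block model (parameters are assumptions).
rng = np.random.default_rng(0)
n = 400
z = np.repeat([0, 1], n // 2)                  # latent block labels
P = np.array([[0.15, 0.02], [0.02, 0.15]])     # block connection probabilities
probs = P[z][:, z]
A = rng.binomial(1, np.triu(probs, 1))
A = A + A.T                                    # symmetric adjacency, zero diagonal

# Regularized normalized Laplacian: D_tau^{-1/2} A D_tau^{-1/2},
# with tau set to the average degree (one common regularization choice).
tau = A.sum(axis=1).mean()
d = A.sum(axis=1) + tau
L = A / np.sqrt(np.outer(d, d))

# Spiked eigenvectors (largest in magnitude) form the latent embedding.
vals, vecs = np.linalg.eigh(L)
order = np.argsort(-np.abs(vals))
spiked_vecs = vecs[:, order[:2]]               # top-2 spiked eigenvectors
clusters = (spiked_vecs[:, 1] > 0).astype(int) # sign of 2nd spike splits blocks
```

The second spiked eigenvector separates the two latent blocks by sign; the paper's ATE-GL theory is about the fluctuations of exactly such empirical spiked eigenvectors around their population counterparts.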
Problem

Research questions and friction points this paper is trying to address.

Develops asymptotic theory for eigenvectors in latent embeddings with generalized Laplacian matrices.
Addresses dependency, a major bottleneck in random matrix theory, via the new ATE-GL framework.
Enables precise inference and uncertainty quantification for applications involving generalized Laplacians.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized Laplacian matrices for latent embeddings
Asymptotic theory using quadratic vector equations
High-order expansions for eigenvectors and eigenvalues
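The "spiked eigenvalue" behavior that these expansions refine can be illustrated with a small Monte Carlo in the classical independent-entry setting (standard RMT, not the paper's dependent generalized-Laplacian setting): for a rank-one spiked Wigner matrix H = theta*vv^T + W/sqrt(n), the top empirical eigenvalue concentrates around theta + 1/theta when theta > 1, with asymptotically normal fluctuations. All parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta = 300, 3.0
v = np.ones(n) / np.sqrt(n)                    # unit spike direction

def top_eig():
    # GOE-type noise: symmetric Gaussian matrix, off-diagonal variance 1.
    W = rng.standard_normal((n, n))
    W = (W + W.T) / np.sqrt(2)
    H = theta * np.outer(v, v) + W / np.sqrt(n)
    return np.linalg.eigvalsh(H)[-1]           # largest eigenvalue

# Replicate and compare the empirical mean with the classical limit theta + 1/theta.
lams = np.array([top_eig() for _ in range(100)])
```

Here `lams.mean()` lands near theta + 1/theta ≈ 3.33 with O(1/sqrt(n)) fluctuations; ATE-GL establishes analogous high-order expansions and central limit theorems when the entries are dependent, as in normalized Laplacians.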
Jianqing Fan
Department of Operations Research and Financial Engineering, Princeton University
Yingying Fan
Data Sciences and Operations Department, Marshall School of Business, University of Southern California
Jinchi Lv
Kenneth King Stonier Chair in Business Administration
AI for Business and Applications · Statistics and Data Science · Machine Learning
Fan Yang
Yau Mathematical Sciences Center, Tsinghua University
Diwen Yu
Yau Mathematical Sciences Center, Tsinghua University