🤖 AI Summary
This work investigates the spectral convergence of the graph Laplacian, constructed from random samples on a manifold, to the continuous Laplace–Beltrami operator in high-dimensional spaces, with a central focus on the optimal choice of the Gaussian kernel bandwidth ε. We propose a unified analytical framework based on manifold heat kernel interpolation; this marks the first integration of heat kernel interpolation techniques into the spectral convergence analysis of graph Laplacians and overcomes the strong ε-dependence inherent in conventional approaches. We rigorously establish that, when ε ≍ N⁻¹⁄⁽ᵈ⁺²⁾, the eigenvalues and suitably normalized eigenfunctions of the graph Laplacian converge uniformly to those of the Laplace–Beltrami operator at rate O(ε¹⁄²). This result provides foundational theoretical guarantees for graph neural networks and geometric deep learning.
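As a minimal numerical sketch of the construction being analyzed (not code from the paper), the snippet below builds the unnormalized Gaussian-kernel graph Laplacian on points from the unit circle, where d = 1 and the Laplace–Beltrami eigenvalues are known to be k² with multiplicity 2. The bandwidth follows the scaling ε ≍ N⁻¹⁄⁽ᵈ⁺²⁾ from the summary; the helper name and the use of a regular grid instead of random samples (for reproducibility) are illustrative choices.

```python
import numpy as np

def graph_laplacian_spectrum(points, eps, k=5):
    """Smallest k eigenvalues of the unnormalized graph Laplacian L = D - W
    with Gaussian weights W_ij = exp(-|x_i - x_j|^2 / eps)."""
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / eps)
    np.fill_diagonal(W, 0.0)          # no self-loops
    L = np.diag(W.sum(axis=1)) - W    # D - W, symmetric PSD
    return np.linalg.eigvalsh(L)[:k]

N, d = 400, 1                         # sample size and manifold dimension
eps = N ** (-1.0 / (d + 2))           # bandwidth scaling from the summary
theta = 2 * np.pi * np.arange(N) / N  # regular grid on S^1 for reproducibility
circle = np.column_stack([np.cos(theta), np.sin(theta)])

vals = graph_laplacian_spectrum(circle, eps)
# On S^1 the Laplace-Beltrami eigenvalues are k^2, each with multiplicity 2,
# so the second nonzero eigenvalue pair over the first should approach 2^2 = 4
# (up to the discretization error the paper's rates quantify).
ratio = vals[3] / vals[1]
print(f"lambda_0 ~ {vals[0]:.2e}, eigenvalue ratio ~ {ratio:.2f}")
```

The observed ratio is close to, but not exactly, 4: the gap is precisely the kind of bandwidth-dependent spectral error whose O(ε¹⁄²) decay the paper establishes.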