🤖 AI Summary
This paper investigates the Γ-convergence of graph Dirichlet energies and the spectral convergence of graph Laplacians on multidimensional hybrid manifolds, that is, unions of manifolds with distinct intrinsic dimensions, to elucidate how machine learning methods adapt to variable-dimensional data structures. The authors show that the unnormalized graph Laplacian asymptotically captures only the highest-dimensional component, whereas the normalized graph Dirichlet energy adaptively responds to all dimensional constituents, establishing for the first time from a convergence perspective its intrinsic robustness to mixed dimensionality. Combining Γ-convergence analysis, spectral graph theory, nonlocal operator approximation, and manifold learning, they develop a unified convergence framework for the normalized energy on multidimensional manifolds and rigorously prove its uniform spectral convergence. Numerical experiments corroborate the theoretical predictions.
📝 Abstract
We study $Γ$-convergence of graph Dirichlet energies and spectral convergence of graph Laplacians on unions of intersecting manifolds of potentially different dimensions. Our investigation is motivated by problems in machine learning, as real-world data often consist of parts or classes with different intrinsic dimensions. An important challenge is to understand which machine learning methods adapt to such varied dimensionalities. We investigate the standard unnormalized and the normalized graph Dirichlet energies. We show that the unnormalized energy and its associated graph Laplacian asymptotically see only the variations within the manifold of the highest dimension. On the other hand, we prove that the normalized Dirichlet energy converges to a (tensorized) Dirichlet energy on the union of manifolds that adapts to all dimensions simultaneously. We also establish the related spectral convergence and present a few numerical experiments illustrating our findings.
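To make the two objects contrasted in the abstract concrete, the following is a minimal numerical sketch: it samples a hybrid set (a 1D segment glued to a 2D square), builds a truncated Gaussian weight matrix, and evaluates an unnormalized graph Dirichlet energy alongside a degree-normalized variant. The kernel, bandwidth, and the specific normalization here are illustrative assumptions; the paper's exact scaling and normalization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hybrid sample: a 1D segment on [-1, 0] x {0} and a 2D unit square [0, 1]^2.
seg = np.stack([rng.uniform(-1.0, 0.0, 200), np.zeros(200)], axis=1)  # 1D part
sq = rng.uniform(0.0, 1.0, size=(200, 2))                             # 2D part
X = np.vstack([seg, sq])

eps = 0.15  # graph length scale (illustrative choice)
d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.exp(-d2 / eps**2) * (d2 < (2 * eps) ** 2)  # truncated Gaussian weights
np.fill_diagonal(W, 0.0)
deg = W.sum(axis=1)  # node degrees

def dirichlet_unnorm(u):
    # E(u) = (1/2) * sum_{i,j} W_ij (u_i - u_j)^2
    return 0.5 * np.sum(W * (u[:, None] - u[None, :]) ** 2)

def dirichlet_norm(u):
    # Degree-normalized variant: weights rescaled by sqrt(deg_i * deg_j)
    # (a stand-in for the normalization analyzed in the paper; assumption).
    Wn = W / np.sqrt(deg[:, None] * deg[None, :])
    return 0.5 * np.sum(Wn * (u[:, None] - u[None, :]) ** 2)

u = X[:, 0]  # test function: first coordinate, varying on both components
print(dirichlet_unnorm(u), dirichlet_norm(u))
```

On such samples the degrees on the 2D part scale differently with `eps` than on the 1D part, which is the mechanism behind the unnormalized energy concentrating on the highest-dimensional component.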