🤖 AI Summary
Existing graph neural networks (GNNs) lack a spectrally consistent, quantitative metric for oversmoothing. Method: We systematically compare the theoretical properties of the Dirichlet energies induced by the unnormalized and normalized graph Laplacians as oversmoothing measures, analyzing their adherence to Rusch et al.'s node-similarity axioms (particularly scale invariance and local consistency) and establishing a formal spectral framework that characterizes their distinct eigenvalue decay patterns and low-frequency energy-concentration behaviors. Results: We prove that the normalized Dirichlet energy violates key axioms, whereas the unnormalized variant satisfies all of them. This work identifies, for the first time, the axiomatic inconsistency of the normalized variant and establishes the unnormalized Dirichlet energy as a theoretically grounded, spectrally compatible metric for monitoring GNN oversmoothing, providing a rigorous, interpretable basis for tracking training dynamics.
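For concreteness, the two functionals under comparison are the standard graph Dirichlet energies below (spelled out here for reference; the paper may use a slightly different normalization convention):

```latex
% Graph with adjacency A, degrees d_i, degree matrix D,
% node features X \in \mathbb{R}^{n \times d} with rows x_i.

% Unnormalized Laplacian L = D - A:
E(X) = \operatorname{tr}\!\left(X^{\top} L X\right)
     = \tfrac{1}{2} \sum_{i,j} A_{ij}\, \lVert x_i - x_j \rVert_2^2

% Normalized Laplacian L_{\mathrm{sym}} = I - D^{-1/2} A D^{-1/2}:
E_{\mathrm{sym}}(X) = \operatorname{tr}\!\left(X^{\top} L_{\mathrm{sym}} X\right)
     = \tfrac{1}{2} \sum_{i,j} A_{ij}\,
       \Bigl\lVert \tfrac{x_i}{\sqrt{d_i}} - \tfrac{x_j}{\sqrt{d_j}} \Bigr\rVert_2^2
```

The key structural difference is the null space: L annihilates the constant vector 1, whereas L_sym annihilates D^{1/2}1, so the normalized energy need not vanish on constant (fully oversmoothed) features unless the graph is regular.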
📝 Abstract
We analyze the distinctions between two functionals often used as over-smoothing measures: the Dirichlet energies induced by the unnormalized graph Laplacian and the normalized graph Laplacian. We demonstrate that the latter fails to satisfy the axiomatic definition of a node-similarity measure proposed by Rusch et al. By formalizing fundamental spectral properties of these two definitions, we highlight the critical distinctions needed to select a metric that is spectrally compatible with the GNN architecture, thereby resolving ambiguities in monitoring over-smoothing dynamics.
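A minimal numeric sketch of the axiom failure (assuming NumPy; this snippet is illustrative and not taken from the paper): on a non-regular graph with constant node features, the unnormalized Dirichlet energy is exactly zero, as a node-similarity measure requires, while the normalized one is not.

```python
import numpy as np

# Path graph on 3 nodes (degrees 1, 2, 1), i.e. non-regular.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
deg = A.sum(axis=1)
L = np.diag(deg) - A                                  # unnormalized Laplacian
D_inv_sqrt = np.diag(deg ** -0.5)
L_sym = np.eye(3) - D_inv_sqrt @ A @ D_inv_sqrt       # normalized Laplacian

# Constant (fully oversmoothed) node features: every row identical.
X = np.ones((3, 2))

E_unnorm = np.trace(X.T @ L @ X)     # 0.0: constant features have zero energy
E_norm = np.trace(X.T @ L_sym @ X)   # > 0 here, since the graph is non-regular

print(f"unnormalized Dirichlet energy: {E_unnorm:.4f}")   # 0.0000
print(f"normalized Dirichlet energy:   {E_norm:.4f}")     # ~0.3431
```

Since a node-similarity measure must vanish exactly on constant features, the nonzero normalized energy in this example witnesses the axiom violation described above.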