🤖 AI Summary
This work addresses the limitations of graph neural networks in long-range information propagation, where over-smoothing and over-squashing often hinder performance. Existing approaches typically incur high computational overhead or require structural modifications to the input graph. To overcome these issues, this study introduces Bakry–Émery geometry into graph learning for the first time, proposing a learnable Bakry–Émery graph Laplacian that integrates diffusion and advection mechanisms. By adaptively modulating a node-wise potential field according to the task, the operator controls spectral properties without altering the underlying graph topology. This Laplacian serves as a plug-and-play replacement for the standard graph Laplacian and is embedded within a Chebyshev spectral filtering framework to construct mu-ChebNet. Experiments demonstrate consistent performance gains on both synthetic long-range reasoning tasks and real-world benchmark datasets, while also offering an interpretable mechanism for information routing.
📝 Abstract
Graph Neural Networks (GNNs) often struggle to propagate information across long distances due to over-smoothing and over-squashing. Existing remedies such as graph transformers or graph rewiring typically incur high computational cost or require altering the graph structure. We introduce a Bakry–Émery graph Laplacian that integrates diffusion and advection through a learnable node-wise potential, inducing task-dependent propagation dynamics without modifying topology. This operator has a well-behaved spectral decomposition and acts as a drop-in replacement for standard Laplacians in spectral GNNs. Building on this operator, we develop mu-ChebNet, a spectral architecture that jointly learns the potential and Chebyshev filters, effectively bridging message-passing adaptivity and spectral efficiency. Our theoretical analysis shows how the potential modulates the spectrum, enabling control of key graph properties. Empirically, mu-ChebNet delivers consistent gains on synthetic long-range reasoning tasks as well as real-world benchmarks, while offering an interpretable routing field that reveals how information flows through the graph. This establishes the Bakry–Émery Laplacian as a principled and efficient foundation for adaptive spectral graph learning.
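The abstract does not spell out the operator, so the following is only an illustrative sketch of the general idea: one simple way to fold a node-wise potential into a graph Laplacian is to reweight each edge by the potential at its endpoints (here via `exp(-(V_i + V_j)/2)`, an assumed convention, not necessarily the paper's) and then apply standard Chebyshev polynomial filtering to the resulting symmetric operator. The potential values, the graph, and the filter coefficients below are all hypothetical; in mu-ChebNet the potential and the Chebyshev coefficients would be learned jointly.

```python
import numpy as np

def bakry_emery_laplacian(A, V):
    """Sketch of a potential-weighted ("Bakry-Emery-style") graph Laplacian.

    A : (n, n) symmetric adjacency matrix
    V : (n,) node-wise potential (learnable in the paper; fixed here)

    Edges are reweighted by exp(-(V_i + V_j)/2), so a high potential acts
    as a barrier that damps propagation through that node, while the
    operator stays symmetric positive semidefinite (spectral filtering
    therefore still applies). The exact operator in the paper may differ.
    """
    W = A * np.exp(-(V[:, None] + V[None, :]) / 2.0)
    D = np.diag(W.sum(axis=1))
    return D - W

def chebyshev_filter(L, x, theta):
    """Apply sum_k theta_k T_k(L_hat) x with the spectrum rescaled to [-1, 1]."""
    lam_max = np.linalg.eigvalsh(L).max()
    L_hat = 2.0 * L / lam_max - np.eye(L.shape[0])
    t_prev, t_curr = x, L_hat @ x            # T_0 x and T_1 x
    out = theta[0] * t_prev + theta[1] * t_curr
    for k in range(2, len(theta)):
        t_next = 2.0 * L_hat @ t_curr - t_prev  # Chebyshev recurrence
        out += theta[k] * t_next
        t_prev, t_curr = t_curr, t_next
    return out

# Toy 4-node path graph with a high-potential "barrier" in the middle.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
V = np.array([0.0, 2.0, 2.0, 0.0])
L_mu = bakry_emery_laplacian(A, V)
x = np.array([1.0, 0.0, 0.0, 0.0])
y = chebyshev_filter(L_mu, x, theta=np.array([0.5, -0.3, 0.1]))
```

Because the reweighting only rescales edge weights, `L_mu` keeps the defining Laplacian properties (symmetric, rows summing to zero, nonnegative spectrum), which is what makes it a drop-in replacement inside an existing Chebyshev filtering pipeline.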