Advection-Diffusion on Graphs: A Bakry-Emery Laplacian for Spectral Graph Neural Networks

📅 2026-02-20
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the difficulty graph neural networks face in long-range information propagation, where over-smoothing and over-squashing often hinder performance. Existing remedies typically incur high computational overhead or require structural modifications to the input graph. To overcome these issues, the study introduces Bakry–Émery geometry into graph learning for the first time, proposing a learnable Bakry–Émery graph Laplacian that integrates diffusion and advection mechanisms. By adaptively modulating a node-wise potential field according to the task at hand, the operator controls spectral properties without altering the underlying graph topology. The Laplacian serves as a plug-and-play replacement for the standard graph Laplacian and is embedded within a Chebyshev spectral filtering framework to construct mu-ChebNet. Experiments demonstrate consistent gains on both synthetic long-range reasoning tasks and real-world benchmark datasets, while also offering an interpretable mechanism for information routing.

📝 Abstract
Graph Neural Networks (GNNs) often struggle to propagate information across long distances due to oversmoothing and oversquashing. Existing remedies such as graph transformers or rewiring typically incur high computational cost or require altering the graph structure. We introduce a Bakry-Emery graph Laplacian that integrates diffusion and advection through a learnable node-wise potential, inducing task-dependent propagation dynamics without modifying topology. This operator has a well-behaved spectral decomposition and acts as a drop-in replacement for standard Laplacians in spectral GNNs. Building on this insight, we develop mu-ChebNet, a spectral architecture that jointly learns the potential and Chebyshev filters, effectively bridging message-passing adaptivity and spectral efficiency. Our theoretical analysis shows how the potential modulates the spectrum, enabling control of key graph properties. Empirically, mu-ChebNet delivers consistent gains on synthetic long-range reasoning tasks, as well as real-world benchmarks, while offering an interpretable routing field that reveals how information flows through the graph. This establishes the Bakry-Emery Laplacian as a principled and efficient foundation for adaptive spectral graph learning.
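The abstract describes the operator only at a high level (diffusion plus advection driven by a learnable node-wise potential, used in place of the standard Laplacian inside Chebyshev filtering). The sketch below is not the paper's construction; it uses one common discretization of a Bakry–Émery (weighted) Laplacian, where edge weights and the node measure are reweighted by a potential `V`, and then applies a third-order Chebyshev filter as in ChebNet. The toy path graph, the fixed potential `V`, and the filter coefficients `theta` are all illustrative assumptions.

```python
import numpy as np

# Toy path graph with 5 nodes (illustrative only; the paper's exact
# construction is not given in the abstract).
n = 5
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(1)) - A                  # combinatorial Laplacian

# Hypothetical node-wise potential V (learnable in the paper; fixed here).
V = np.linspace(0.0, 1.0, n)

# One standard weighted-Laplacian discretization: reweight each edge by
# exp(-(V_i + V_j)/2) and each node by the measure mu_i = exp(-V_i).
W = A * np.exp(-(V[:, None] + V[None, :]) / 2.0)
mu = np.exp(-V)
L_mu = np.diag(1.0 / mu) @ (np.diag(W.sum(1)) - W)

# L_mu is similar to a symmetric PSD matrix (conjugate by diag(sqrt(mu))),
# so its spectrum is real and nonnegative -- the "well-behaved spectral
# decomposition" property the abstract mentions.
evals = np.linalg.eigvals(L_mu)
assert np.allclose(np.imag(evals), 0.0, atol=1e-8)

# Chebyshev filtering on the rescaled operator, as in ChebNet:
lam_max = np.max(np.real(evals))
L_hat = 2.0 * L_mu / lam_max - np.eye(n)   # spectrum mapped into [-1, 1]
x = np.random.default_rng(0).standard_normal(n)
T0, T1 = x, L_hat @ x
theta = [0.5, 0.3, 0.2]                    # illustrative filter coefficients
y = theta[0] * T0 + theta[1] * T1 + theta[2] * (2 * L_hat @ T1 - T0)
print(y.shape)                             # prints (5,)
```

In a learnable setting, `V` and `theta` would be trained jointly; raising `V` at a node shrinks the measure there and reweights incident edges, which shifts the spectrum and hence the propagation dynamics without touching the graph's topology.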
Problem

Research questions and friction points this paper is trying to address.

oversmoothing
oversquashing
Graph Neural Networks
long-range propagation
spectral GNNs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bakry-Emery Laplacian
spectral graph neural networks
advection-diffusion
learnable potential
mu-ChebNet
Authors

Pierre-Gabriel Berlureau
École Normale Supérieure – PSL, Paris, France

Ali Hariri
Senior Researcher at Huawei
Access Control · Usage Control · IoT Security · Network Security · Data Spaces

Victor Kawasaki-Borruat
École polytechnique fédérale de Lausanne (EPFL), Lausanne, Switzerland

Mia Zosso
École polytechnique fédérale de Lausanne (EPFL), Lausanne, Switzerland

Pierre Vandergheynst
Professor of Electrical Engineering, École Polytechnique Fédérale de Lausanne (EPFL)
data science · machine learning · artificial intelligence · network science · computer vision