On Oversquashing in Graph Neural Networks Through the Lens of Dynamical Systems

📅 2024-05-02
📈 Citations: 2
Influential: 0
📄 PDF
🤖 AI Summary
Long-range information propagation in Graph Neural Networks (GNNs) is fundamentally hindered by the *oversquashing* problem, causing exponential decay of information flow with increasing hop count. To address this, we introduce SWAN—a novel GNN architecture grounded in dynamical systems theory—that jointly enforces global and local non-dissipativity via dual anti-symmetric parameterizations in both the spatial and weight domains. By ensuring energy conservation and nonlinear stability throughout message passing, SWAN theoretically guarantees constant-rate long-range information propagation. Empirically, SWAN significantly alleviates oversquashing on synthetic and real-world long-range interaction benchmarks. It consistently improves performance on node classification and graph-level prediction tasks, demonstrating enhanced modeling of long-range dependencies. Our work establishes non-dissipative dynamics as a fundamental mechanism for boosting the expressive power of GNNs.
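The summary attributes SWAN's constant-rate propagation to anti-symmetric parameterizations. The paper's exact update rule is not given here, but the core mechanism can be sketched: a real antisymmetric matrix `A = W - Wᵀ` has purely imaginary eigenvalues, so the linearized dynamics neither decay nor explode. The graph, update rule, and step size below are hypothetical illustrations, not SWAN's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def antisymmetric(W):
    """Antisymmetric parameterization: A = W - W^T, so A^T = -A."""
    return W - W.T

# Toy setup (hypothetical): 5 nodes with 4 features on an undirected ring.
n, d = 5, 4
X = rng.standard_normal((n, d))   # node features
W = rng.standard_normal((d, d))   # free weight parameter
A = antisymmetric(W)

adj = np.zeros((n, n))
for i in range(n):                # ring adjacency
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0

def step(X, A, adj, eps=0.1):
    """One forward-Euler step of a schematic non-dissipative
    message-passing update: x_i <- x_i + eps * tanh(A x_i + sum_j a_ij x_j)."""
    msgs = adj @ X                # neighborhood aggregation
    return X + eps * np.tanh(X @ A.T + msgs)

# Antisymmetry => eigenvalues of A are purely imaginary, so the
# linearized state transition preserves signal magnitude across hops
# (the non-dissipativity the summary describes).
eigs = np.linalg.eigvals(A)
print(np.allclose(eigs.real, 0.0, atol=1e-8))  # True
```

The same idea applied in the "spatial domain" would constrain how neighbor contributions enter the aggregation; here only the weight-domain half is illustrated.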

📝 Abstract
A common problem in Message-Passing Neural Networks is oversquashing -- the limited ability to facilitate effective information flow between distant nodes. Oversquashing is attributed to the exponential decay in information transmission as node distances increase. This paper introduces a novel perspective to address oversquashing, leveraging the dynamical systems properties of global and local non-dissipativity, which enable the maintenance of a constant information flow rate. We present SWAN, a uniquely parameterized GNN model with antisymmetry in both the space and weight domains, as a means to obtain non-dissipativity. Our theoretical analysis asserts that by implementing these properties, SWAN offers an enhanced ability to transmit information over extended distances. Empirical evaluations on synthetic and real-world benchmarks that emphasize long-range interactions validate the theoretical understanding of SWAN and its ability to mitigate oversquashing.
Problem

Research questions and friction points this paper is trying to address.

Addresses oversquashing in Graph Neural Networks
Enhances information flow between distant nodes
Introduces SWAN model for non-dissipative information transmission
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages dynamical systems for constant information flow
Introduces SWAN with antisymmetry in space and weights
Enhances long-range information transmission in GNNs