🤖 AI Summary
Graph Neural Networks (GNNs) are prone to oversquashing: channel bottlenecks impede long-range information propagation, a limitation particularly pronounced in dense or heterophilic graphs. This work proposes a novel approach that, for the first time, applies cross-attention over cohesive subgraph representations to dynamically construct structure-aware embeddings during message passing. By doing so, the method enriches node representations with global contextual information while suppressing spurious connections. Crucially, it alleviates oversquashing without exacerbating the channel bottlenecks themselves. Extensive experiments demonstrate consistent gains over existing baselines across multiple benchmark datasets, yielding notably higher classification accuracy.
📝 Abstract
Graph neural networks (GNNs) have achieved strong performance across various real-world domains. Nevertheless, they suffer from oversquashing, where long-range information is distorted as it is compressed through limited message-passing pathways. This bottleneck limits their ability to capture essential global context and degrades their performance, particularly in dense and heterophilic regions of graphs. To address this issue, we propose a novel graph learning framework that enriches node embeddings with cross-attentive cohesive subgraph representations, mitigating the distortion of long-range dependencies. The framework enhances node representations by emphasizing cohesive structure in long-range information while removing noisy or irrelevant connections. It thus preserves essential global context without overloading the narrow bottlenecked channels, further mitigating oversquashing. Extensive experiments on multiple benchmark datasets demonstrate that our model achieves consistent improvements in classification accuracy over standard baseline methods.
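The core idea above — each node attending over embeddings of cohesive subgraphs to pull in global context without routing it through narrow message-passing channels — might be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions: the function and weight names (`cross_attend`, `Wq`, `Wk`, `Wv`) are hypothetical, and how the cohesive subgraph embeddings are computed is not shown here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attend(node_feats, subgraph_embs, Wq, Wk, Wv):
    """Hypothetical sketch: nodes (queries) cross-attend over cohesive
    subgraph embeddings (keys/values); the attention-weighted context is
    added residually to enrich each node's representation."""
    Q = node_feats @ Wq                     # (n_nodes, d)
    K = subgraph_embs @ Wk                  # (n_subgraphs, d)
    V = subgraph_embs @ Wv                  # (n_subgraphs, d)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # scaled dot-product scores
    attn = softmax(scores, axis=-1)          # each node's weights over subgraphs
    context = attn @ V                       # global context per node
    return node_feats + context              # residual enrichment

# Toy example: 5 nodes, 3 cohesive subgraphs, 8-dim features (all synthetic).
rng = np.random.default_rng(0)
n_nodes, n_subgraphs, d = 5, 3, 8
nodes = rng.standard_normal((n_nodes, d))
subs = rng.standard_normal((n_subgraphs, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))

out = cross_attend(nodes, subs, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because each node attends directly to a handful of subgraph-level summaries rather than relaying information hop by hop, long-range context reaches the node in one step — which is the intuition behind avoiding the compression bottleneck.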