Cluster Attention for Graph Machine Learning

📅 2026-04-08
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the limited receptive field of existing graph neural networks and the lack of structural inductive bias in global attention mechanisms. To overcome these limitations, the authors propose Cluster Attention (CLATT), a mechanism that integrates graph community detection with attention. Specifically, nodes are partitioned into clusters via community detection, and attention is computed among the nodes within each cluster. This design expands the model's receptive field while preserving essential topological information from the graph structure. Used to augment both message-passing and Graph Transformer architectures, CLATT achieves substantial performance gains on multiple real-world graph benchmarks, including GraphLand, demonstrating a strong balance between expressive power and structural awareness.
πŸ“ Abstract
Message Passing Neural Networks have recently become the most popular approach to graph machine learning tasks; however, their receptive field is limited by the number of message passing layers. To increase the receptive field, Graph Transformers with global attention have been proposed; however, global attention does not take into account the graph topology and thus lacks graph-structure-based inductive biases, which are typically very important for graph machine learning tasks. In this work, we propose an alternative approach: cluster attention (CLATT). We divide graph nodes into clusters with off-the-shelf graph community detection algorithms and let each node attend to all other nodes in each cluster. CLATT provides large receptive fields while still having strong graph-structure-based inductive biases. We show that augmenting Message Passing Neural Networks or Graph Transformers with CLATT significantly improves their performance on a wide range of graph datasets including datasets from the recently introduced GraphLand benchmark representing real-world applications of graph machine learning.
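The core idea of cluster attention as described above can be sketched in a few lines: partition the nodes into clusters (here the cluster assignment is taken as given, standing in for an off-the-shelf community detection step), then run scaled dot-product self-attention restricted to the nodes of each cluster. This is a hypothetical illustration of the mechanism, not the authors' implementation; function and variable names are my own, and a real model would add learned query/key/value projections.

```python
import numpy as np

def cluster_attention(X, clusters):
    """Single-head self-attention restricted to nodes sharing a cluster.

    X        : (n, d) array of node features.
    clusters : length-n array of cluster ids (assumed to come from a
               community detection algorithm, e.g. Louvain).
    Returns an (n, d) array of attention-weighted node updates.
    Illustrative sketch only; learned projections are omitted.
    """
    n, d = X.shape
    out = np.zeros_like(X, dtype=float)
    for c in np.unique(clusters):
        idx = np.where(clusters == c)[0]
        Xc = X[idx]                       # features of this cluster's nodes
        scores = Xc @ Xc.T / np.sqrt(d)   # scaled dot-product scores
        scores -= scores.max(axis=1, keepdims=True)   # numerical stability
        w = np.exp(scores)
        w /= w.sum(axis=1, keepdims=True)             # row-wise softmax
        out[idx] = w @ Xc                 # each node mixes only its cluster
    return out

# Tiny usage example: 4 one-hot nodes, two clusters of two nodes each.
X = np.eye(4)
clusters = np.array([0, 0, 1, 1])
Y = cluster_attention(X, clusters)
```

Because attention is confined to clusters, node 0's output mixes only nodes 0 and 1; its entries in columns 2 and 3 stay exactly zero, which is the structural inductive bias global attention lacks.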
Problem

Research questions and friction points this paper is trying to address.

graph machine learning
receptive field
graph topology
inductive bias
attention mechanism
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cluster Attention
Graph Machine Learning
Inductive Bias
Message Passing Neural Networks
Graph Transformers