Position: Don't be Afraid of Over-Smoothing And Over-Squashing

📅 2026-01-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study challenges the prevailing view that over-smoothing and over-squashing are the primary bottlenecks limiting graph neural network (GNN) performance. Through systematic evaluation of GNNs of varying depths and mitigation strategies across multiple benchmark datasets, complemented by statistical analysis of information distribution, the authors demonstrate that performance degradation is largely unrelated to over-smoothing. Moreover, architectural modifications designed to alleviate over-squashing yield negligible improvements. The findings underscore that the receptive field often lacks task-relevant label information, suggesting that GNN behavior should be understood through a task- and data-driven lens rather than an indiscriminate pursuit of deeper architectures or long-range interactions. In practice, optimal model depth remains relatively shallow.
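A common operational proxy for over-smoothing (a standard metric in the literature, not something specific to this paper) is the Dirichlet energy of node embeddings: as representations collapse toward a common value across a layer stack, the energy approaches zero. A minimal NumPy sketch, with an illustrative toy graph and features:

```python
import numpy as np

def dirichlet_energy(x, edges):
    """Dirichlet energy of node features over an undirected edge list.

    Low energy means neighbouring node embeddings are nearly identical,
    which is the usual operational definition of over-smoothing.
    """
    src, dst = edges
    diffs = x[src] - x[dst]
    return 0.5 * np.sum(diffs ** 2)

# Toy 4-node path graph 0-1-2-3 (each undirected edge listed once).
edges = (np.array([0, 1, 2]), np.array([1, 2, 3]))

distinct = np.array([[0.0], [1.0], [2.0], [3.0]])   # features differ per node
smoothed = np.array([[1.5], [1.5], [1.5], [1.5]])   # features fully collapsed

print(dirichlet_energy(distinct, edges))  # 1.5
print(dirichlet_energy(smoothed, edges))  # 0.0
```

Tracking this quantity per layer while also logging validation accuracy is one way to reproduce the paper's kind of correlation analysis between smoothing and performance.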

📝 Abstract
Over-smoothing and over-squashing have been extensively studied in the literature on Graph Neural Networks (GNNs) in recent years. We challenge this prevailing focus in GNN research, arguing that these phenomena are less critical for practical applications than assumed. We suggest that performance decreases often stem from uninformative receptive fields rather than over-smoothing. We support this position with extensive experiments on several standard benchmark datasets, demonstrating that accuracy and over-smoothing are mostly uncorrelated and that optimal model depths remain small even with mitigation techniques, thus highlighting the negligible role of over-smoothing. Similarly, we challenge the assumption that over-squashing is always detrimental in practical applications. Instead, we posit that the distribution of relevant information over the graph frequently factorises and is often localised within a small k-hop neighbourhood, questioning the necessity of jointly observing entire receptive fields or engaging in an extensive search for long-range interactions. The results of our experiments show that architectural interventions designed to mitigate over-squashing fail to yield significant performance gains. This position paper advocates for a paradigm shift in theoretical research, urging a diligent analysis of learning tasks and datasets using statistics that measure the underlying distribution of label-relevant information to better understand their localisation and factorisation.
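The abstract's call for "statistics that measure the underlying distribution of label-relevant information" can be made concrete with a simple homophily-style statistic: the fraction of node pairs within k hops that share a label. If this agreement decays quickly with k, most label-relevant signal is local, and long-range message passing has little to offer. A hedged NumPy sketch (the graph, labels, and the specific statistic are illustrative, not the paper's exact methodology):

```python
import numpy as np

def khop_label_agreement(adj, labels, k):
    """Fraction of (node, k-hop-neighbour) pairs that share a label.

    A crude statistic for how far label-relevant signal extends: rapid
    decay with k suggests the useful information is localised.
    """
    n = len(adj)
    # Nodes reachable within k hops: boolean k-th power of (A + I).
    reach = np.linalg.matrix_power((adj > 0).astype(int) + np.eye(n, dtype=int), k) > 0
    np.fill_diagonal(reach, False)  # exclude each node itself
    same = labels[:, None] == labels[None, :]
    return (reach & same).sum() / reach.sum()

# Two triangles (one per class) joined by a single bridge edge 2-3.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
adj = np.zeros((6, 6), dtype=int)
for i, j in edges:
    adj[i, j] = adj[j, i] = 1
labels = np.array([0, 0, 0, 1, 1, 1])

print(round(khop_label_agreement(adj, labels, 1), 3))  # 0.857
print(round(khop_label_agreement(adj, labels, 2), 3))  # 0.545
```

Here agreement drops as the neighbourhood widens, illustrating the kind of localisation the authors argue should be measured before reaching for deeper models or over-squashing fixes.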
Problem

Research questions and friction points this paper is trying to address.

over-smoothing
over-squashing
Graph Neural Networks
receptive field
information localization
Innovation

Methods, ideas, or system contributions that make the work stand out.

over-smoothing
over-squashing
graph neural networks
receptive field informativeness
information localization