Unlocking the Potential of Classic GNNs for Graph-level Tasks: Simple Architectures Meet Excellence

📅 2025-02-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
The prevailing assumption that Graph Transformers (GTs) inherently outperform Graph Neural Networks (GNNs) is challenged: GNNs' performance on graph-level tasks has been underestimated, with their supposed weaknesses attributed to limited expressive power, over-smoothing, over-squashing, and inadequate modeling of long-range dependencies. Method: GNN+ is a lightweight, general-purpose enhancement framework that systematically integrates six orthogonal techniques—edge feature incorporation, normalization, dropout, residual connections, feed-forward networks, and positional encoding—into canonical GNNs (e.g., GCN, GIN, GatedGCN). Contribution/Results: Evaluated on 14 standard graph classification and regression benchmarks, GNN+ ranks among the top three on all datasets—achieving state-of-the-art (SOTA) performance on eight—and significantly outperforms GTs in inference speed. These results demonstrate that carefully enhanced, architecturally simple GNNs can comprehensively surpass complex GTs, offering an efficient and robust alternative for graph representation learning.

📝 Abstract
Message-passing Graph Neural Networks (GNNs) are often criticized for their limited expressiveness, issues like over-smoothing and over-squashing, and challenges in capturing long-range dependencies, while Graph Transformers (GTs) are considered superior due to their global attention mechanisms. The literature frequently suggests that GTs outperform GNNs, particularly in graph-level tasks such as graph classification and regression. In this study, we explore the untapped potential of GNNs through an enhanced framework, GNN+, which integrates six widely used techniques—edge feature integration, normalization, dropout, residual connections, feed-forward networks, and positional encoding—to effectively tackle graph-level tasks. We conduct a systematic evaluation of three classic GNNs, namely GCN, GIN, and GatedGCN, enhanced by the GNN+ framework across 14 well-known graph-level datasets. Our results show that, contrary to the prevailing belief, classic GNNs excel in graph-level tasks, securing top-three rankings across all datasets and achieving first place in eight, while also demonstrating greater efficiency than GTs. This highlights the potential of simple GNN architectures, challenging the belief that the complex mechanisms in GTs are essential for superior graph-level performance.
Problem

Research questions and friction points this paper is trying to address.

Enhancing classic GNNs for graph-level tasks
Addressing limitations like over-smoothing in GNNs
Comparing GNN performance with Graph Transformers
Innovation

Methods, ideas, or system contributions that make the work stand out.

Enhanced GNN framework
Integrates six techniques
Outperforms Graph Transformers
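To make the GNN+ recipe concrete, here is a minimal, hypothetical NumPy sketch of one enhanced message-passing layer combining four of the six listed techniques (normalization, dropout, residual connections, and a feed-forward network); edge features and positional encoding are omitted for brevity. All function and parameter names are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_norm(x, eps=1e-5):
    # Normalize each node's feature vector (zero mean, unit variance).
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def dropout(x, p, training):
    # Inverted dropout: identity at inference time.
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def gnn_plus_layer(h, adj, W_msg, W1, W2, p_drop=0.1, training=False):
    """One hypothetical GNN+-style layer: mean neighbor aggregation,
    then norm, dropout, a residual connection, and an FFN sub-block."""
    # Message passing: mean over neighbor features, linearly transformed.
    deg = np.clip(adj.sum(axis=1, keepdims=True), 1, None)
    m = ((adj @ h) / deg) @ W_msg
    # Residual connection around the normalized, dropped-out message.
    h = h + dropout(layer_norm(m), p_drop, training)
    # Feed-forward network (ReLU MLP) with its own residual connection.
    ffn = np.maximum(h @ W1, 0.0) @ W2
    h = h + dropout(layer_norm(ffn), p_drop, training)
    return h

# Toy usage: a 4-node graph with 8-dimensional node features.
n, d = 4, 8
h = rng.normal(size=(n, d))
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
W_msg = 0.1 * rng.normal(size=(d, d))
W1 = 0.1 * rng.normal(size=(d, 2 * d))
W2 = 0.1 * rng.normal(size=(2 * d, d))
out = gnn_plus_layer(h, adj, W_msg, W1, W2)
```

The key point of the framework is that each technique wraps the plain message-passing step without changing it, so the same sketch applies whether the aggregation inside is GCN-, GIN-, or GatedGCN-style.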
Yuankai Luo
Beihang University
Graph Neural Networks · Generative Models

Lei Shi
Beihang University

Xiao-Ming Wu
The Hong Kong Polytechnic University