Exploring and Improving Initialization for Deep Graph Neural Networks: A Signal Propagation Perspective

📅 2025-06-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Deep graph neural networks (GNNs) often suffer from signal degradation, so performance deteriorates as depth increases. Method: Motivated by a signal propagation analysis, the authors propose the Graph Embedding Variation (GEV) metric, a GNN-specific quantitative measure of how much node embeddings vary across the graph, and show that existing initialization schemes cannot jointly regulate forward propagation, backward propagation, and GEV. Based on this insight, they introduce SPoGInit, a systematic initialization framework that searches for weight-initialization variances optimizing all three signal-propagation metrics, with graph-structure-aware, GCN-specific design. Contribution/Results: Extensive experiments demonstrate that SPoGInit significantly enhances training stability and convergence accuracy for deep GNNs. It consistently outperforms standard baselines, including Xavier and He initialization, across diverse tasks and architectures (e.g., GCN, GAT, GIN), enabling sustained performance gains as network depth increases.

📝 Abstract
Graph Neural Networks (GNNs) often suffer from performance degradation as network depth increases. This paper addresses the issue by introducing initialization methods that enhance signal propagation (SP) within GNNs. We propose three key metrics for effective SP in GNNs: forward propagation, backward propagation, and graph embedding variation (GEV). While the first two metrics derive from classical SP theory, the third is designed specifically for GNNs. We theoretically demonstrate that a broad range of commonly used initialization methods for GNNs, which exhibit performance degradation with increasing depth, fail to control these three metrics simultaneously. To address this limitation, we show that a direct exploitation of the SP analysis (searching for weight-initialization variances that optimize the three metrics) significantly enhances SP in deep GCNs. We call this approach Signal Propagation on Graph-guided Initialization (SPoGInit). Our experiments demonstrate that SPoGInit outperforms commonly used initialization methods on various tasks and architectures. Notably, SPoGInit enables performance improvements as GNNs deepen, which represents a significant advance in addressing depth-related challenges and highlights the validity and effectiveness of the SP analysis framework.
Problem

Research questions and friction points this paper is trying to address.

Addressing performance degradation in deep Graph Neural Networks
Improving signal propagation via specialized initialization methods
Controlling forward, backward propagation and graph embedding variation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces SPoGInit for deep GNN initialization
Optimizes forward, backward propagation, and GEV metrics
Enhances signal propagation in deep graph networks
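The variance-search idea behind SPoGInit can be illustrated with a toy experiment: propagate features through a deep linear GCN under a candidate initialization variance, measure a forward-signal ratio and an embedding-variation proxy, and pick the variance that keeps both healthy. This is a minimal, hypothetical sketch; `forward_metrics`, `search_variance`, and the scoring rule are simplified assumptions, not the paper's actual metrics or objective.

```python
import numpy as np

def forward_metrics(A_hat, X, depth, sigma2, rng):
    """Propagate X through a depth-layer linear GCN whose weights are drawn
    with variance sigma2 / width. Returns (norm_ratio, gev):
    - norm_ratio: output norm / input norm (forward signal preservation)
    - gev: mean variance of embeddings across nodes (toy stand-in for GEV)."""
    H = X
    d = X.shape[1]
    for _ in range(depth):
        W = rng.normal(0.0, np.sqrt(sigma2 / d), size=(d, d))
        H = A_hat @ H @ W  # one graph-convolution layer (nonlinearity omitted)
    norm_ratio = np.linalg.norm(H) / np.linalg.norm(X)
    gev = H.var(axis=0).mean()  # how much node embeddings differ across nodes
    return norm_ratio, gev

def search_variance(A_hat, X, depth, candidates, rng):
    """Grid-search candidate variances: prefer a forward norm ratio near 1
    (|log ratio| small) and reject variances whose embeddings collapse
    (gev below a small floor). Both criteria are illustrative choices."""
    best, best_score = None, np.inf
    for s2 in candidates:
        r, gev = forward_metrics(A_hat, X, depth, s2, rng)
        score = abs(np.log(r + 1e-12)) + (np.inf if gev < 1e-8 else 0.0)
        if score < best_score:
            best, best_score = s2, score
    return best
```

A usage sketch: build the symmetrically normalized adjacency `A_hat = D^{-1/2}(A + I)D^{-1/2}` of a small graph, then call `search_variance(A_hat, X, depth=5, candidates=[0.5, 1.0, 2.0, 4.0], rng=np.random.default_rng(0))`. Since the normalized adjacency shrinks signal norms, the search tends to favor variances above the classical Xavier scale, which is the qualitative point of the SP analysis.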
Senmiao Wang
The Chinese University of Hong Kong, Shenzhen, China
Yupeng Chen
The Chinese University of Hong Kong, Shenzhen, China
Yushun Zhang
The Chinese University of Hong Kong, Shenzhen, China
Ruoyu Sun
The Chinese University of Hong Kong, Shenzhen, China; Shenzhen International Center for Industrial and Applied Mathematics; Shenzhen Research Institute of Big Data
Tian Ding
Shenzhen Research Institute of Big Data