On the Effectiveness of Random Weights in Graph Neural Networks

📅 2025-01-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses two critical challenges in graph neural networks (GNNs): high training overhead and severe feature rank collapse. To this end, we propose a lightweight GNN paradigm based on fixed random weights—eliminating learnable parameters entirely and instead employing randomly initialized weights alongside a novel random graph propagation operator. Through rigorous theoretical modeling and message-passing analysis, we uncover the intrinsic mechanism by which this design mitigates rank collapse in node representations. Extensive experiments across multiple tasks and benchmark datasets demonstrate that our method achieves predictive performance on par with fully parameterized GNNs, while accelerating training by up to 6× and reducing GPU memory consumption by up to 3×. Moreover, it exhibits strong robustness under various perturbations. This work provides both a theoretically grounded framework and a practical, efficient, and interpretable alternative for GNN design.

📝 Abstract
Graph Neural Networks (GNNs) have achieved remarkable success across diverse tasks on graph-structured data, primarily through the use of learned weights in message passing layers. In this paper, we demonstrate that random weights can be surprisingly effective, achieving performance comparable to end-to-end training counterparts, across various tasks and datasets. Specifically, we show that by replacing learnable weights with random weights, GNNs can retain strong predictive power, while significantly reducing training time by up to 6× and memory usage by up to 3×. Moreover, the random weights combined with our construction yield random graph propagation operators, which we show to reduce the problem of feature rank collapse in GNNs. These understandings and empirical results highlight random weights as a lightweight and efficient alternative, offering a compelling perspective on the design and training of GNN architectures.
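The core idea described in the abstract can be illustrated with a minimal sketch: a two-layer GCN-style network whose weight matrices are sampled once and never trained, so only a lightweight readout would require optimization. This is an illustrative toy (the graph, dimensions, and scaling below are assumptions for the example), not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph on 4 nodes: edges 0-1, 1-2, 2-3, 3-0
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                          # add self-loops
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
P = (A_hat * d_inv_sqrt).T * d_inv_sqrt        # symmetric normalized propagation

X = rng.normal(size=(4, 8))                    # node features (toy)

# Fixed random weights: sampled once, never updated by gradient descent
W1 = rng.normal(size=(8, 16)) / np.sqrt(8)
W2 = rng.normal(size=(16, 16)) / np.sqrt(16)

H1 = np.maximum(P @ X @ W1, 0)                 # message passing + ReLU
H2 = np.maximum(P @ H1 @ W2, 0)

print(H2.shape)
```

In this setup the only trainable component would be a readout head (e.g., logistic regression) on top of `H2`, which is consistent with the reported training-time and memory savings, since no gradients flow through the message-passing layers.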
Problem

Research questions and friction points this paper is trying to address.

Random Weights
Graph Neural Networks (GNNs)
Training Efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Random Weights
Graph Neural Networks (GNNs)
Efficient Training