Reservoir-Based Graph Convolutional Networks

📅 2026-03-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of conventional graph convolutional networks in modeling long-range dependencies in complex or dynamic graphs, which are often hindered by over-smoothing and high computational costs. The authors propose RGC-Net, a novel framework that integrates reservoir computing with structured graph convolution, leveraging fixed random weights and leaky integrators to enable efficient and stable multi-hop information propagation. This approach mitigates over-smoothing while preserving node-specific features. The framework is further extended to graph generation via an RGC-Net-based Transformer architecture. Experimental results demonstrate state-of-the-art performance in both graph classification and dynamic brain connectome generation, with faster convergence and a strong ability to model the evolution of brain graphs.

📝 Abstract
Message passing is a core mechanism in Graph Neural Networks (GNNs), enabling the iterative update of node embeddings by aggregating information from neighboring nodes. Graph Convolutional Networks (GCNs) exemplify this approach by adapting convolutional operations for graph structures, allowing features from adjacent nodes to be combined effectively. However, GCNs encounter challenges with complex or dynamic data. Capturing long-range dependencies often requires deeper layers, which not only increase computational costs but also lead to over-smoothing, where node embeddings become indistinguishable. To overcome these challenges, reservoir computing has been integrated into GNNs, leveraging iterative message-passing dynamics for stable information propagation without extensive parameter tuning. Despite its promise, existing reservoir-based models lack structured convolutional mechanisms, limiting their ability to accurately aggregate multi-hop neighborhood information. To address these limitations, we propose RGC-Net (Reservoir-based Graph Convolutional Network), which integrates reservoir dynamics with structured graph convolution. Key contributions include: (i) a reimagined convolutional framework with fixed random reservoir weights and a leaky integrator to enhance feature retention; (ii) a robust, adaptable model for graph classification; and (iii) an RGC-Net-powered transformer for graph generation with application to dynamic brain connectivity. Extensive experiments show that RGC-Net achieves state-of-the-art performance in classification and generative tasks, including brain graph evolution, with faster convergence and reduced over-smoothing. Source code is available at https://github.com/basiralab/RGC-Net.
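To make the abstract's core mechanism concrete, here is a minimal sketch of a reservoir-style graph convolution step with a leaky integrator, based only on the description above (fixed random weights, GCN-style neighborhood aggregation, leaky state updates). The function and parameter names (`rgc_layer`, `W_in`, `W_res`, `alpha`) are illustrative assumptions, not taken from the released code at the repository linked above.

```python
import numpy as np

def rgc_layer(A, X, hidden_dim=16, alpha=0.3, n_steps=4, seed=0):
    """Illustrative reservoir-based graph convolution (not the authors' code).

    A: (n, n) adjacency matrix; X: (n, f) node features.
    Returns a (n, hidden_dim) reservoir state per node.
    """
    rng = np.random.default_rng(seed)
    n, f = X.shape
    # Fixed random weights: never trained, in the spirit of reservoir computing.
    W_in = rng.uniform(-0.5, 0.5, size=(f, hidden_dim))
    W_res = rng.uniform(-0.5, 0.5, size=(hidden_dim, hidden_dim))
    # Rescale so the reservoir's spectral radius is below 1 (stable dynamics,
    # a standard echo-state-network convention).
    W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))
    # GCN-style symmetric normalization of the adjacency with self-loops.
    A_hat = A + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    H = np.zeros((n, hidden_dim))
    for _ in range(n_steps):
        # Leaky integrator: keep a (1 - alpha) fraction of the previous state,
        # which is the mechanism the paper credits with curbing over-smoothing.
        H = (1 - alpha) * H + alpha * np.tanh(A_norm @ (X @ W_in + H @ W_res))
    return H
```

Because only the readout on top of such states would be trained, multi-hop propagation comes at the cost of a few fixed matrix products rather than extra trainable layers, which is consistent with the faster convergence the abstract reports.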
Problem

Research questions and friction points this paper is trying to address.

Graph Neural Networks
Reservoir Computing
Graph Convolution
Over-smoothing
Long-range Dependencies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reservoir Computing
Graph Convolutional Networks
Over-smoothing Mitigation
Leaky Integrator
Graph Generation
Mayssa Soussia
National Engineering School of Sousse, University of Sousse, LATIS – Laboratory of Advanced Technology and Intelligent Systems, 4023, Sousse, Tunisia
Gita Ayu Salsabila
BASIRA Lab, Imperial-X and Department of Computing, Imperial College London, London, UK
Mohamed Ali Mahjoub
Professor at ENISo, University of Sousse, Tunisia
Computer vision, Predictive intelligence
Islem Rekik
BASIRA lab
Machine and Deep Learning, Neuroimaging, Network Neuroscience, Predictive Intelligence in Medicine, Connectomics