Dual Mamba for Node-Specific Representation Learning: Tackling Over-Smoothing with Selective State Space Modeling

📅 2025-11-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Deep graph neural networks (GNNs) suffer from over-smoothing due to repeated message passing across layers, degrading node discriminability. Existing remedies—e.g., residual or skip connections—lack explicit modeling of node-specific, progressive representation evolution and neglect global structural context. To address this, we propose DMbaGCN (Dual Mamba-enhanced GCN), the first GNN framework integrating the Mamba architecture. It comprises two complementary pathways: (i) a local state-evolution module leveraging selective state space modeling for adaptive, layer-wise node representation refinement; and (ii) a global context-aware module combining global attention with graph-topology-guided long-range dependency modeling. Extensive experiments on multiple benchmark datasets demonstrate that DMbaGCN significantly mitigates over-smoothing, enhances the expressive power and generalization of deep GNNs, and maintains computational efficiency.


📝 Abstract
Over-smoothing remains a fundamental challenge in deep Graph Neural Networks (GNNs), where repeated message passing causes node representations to become indistinguishable. While existing solutions, such as residual connections and skip layers, alleviate this issue to some extent, they fail to explicitly model how node representations evolve in a node-specific and progressive manner across layers. Moreover, these methods do not take global information into account, which is also crucial for mitigating over-smoothing. To address these issues, we propose the Dual Mamba-enhanced Graph Convolutional Network (DMbaGCN), a novel framework that integrates Mamba into GNNs to address over-smoothing from both local and global perspectives. DMbaGCN consists of two modules: the Local State-Evolution Mamba (LSEMba), which performs local neighborhood aggregation and utilizes Mamba's selective state space modeling to capture node-specific representation dynamics across layers, and the Global Context-Aware Mamba (GCAMba), which leverages Mamba's global attention capabilities to incorporate global context for each node. By combining these components, DMbaGCN enhances node discriminability in deep GNNs, thereby mitigating over-smoothing. Extensive experiments on multiple benchmarks demonstrate the effectiveness and efficiency of our method.
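To make the core idea concrete, here is a minimal sketch of a Mamba-style selective state space scan applied to one node's representations across GNN layers, as LSEMba is described as doing. This is an illustrative assumption, not the paper's implementation: the function name `selective_scan`, the gating matrix `W_delta`, and the diagonal parameters `A`, `B`, `C` are all simplified stand-ins for the full Mamba parameterization.

```python
import numpy as np

def selective_scan(xs, W_delta, A, B, C):
    """Heavily simplified Mamba-style selective scan (illustrative only).

    xs: (L, d) array — one node's representation at each of L GNN layers,
        treated as a sequence so the SSM can model layer-wise evolution.
    The step size `delta` depends on the input, which is what makes the
    recurrence "selective": each node adapts how much past state it keeps.
    """
    L, d = xs.shape
    h = np.zeros(d)                                   # hidden SSM state
    ys = []
    for x in xs:
        delta = 1.0 / (1.0 + np.exp(-(x @ W_delta)))  # input-dependent step size
        h = np.exp(delta * A) * h + delta * (B * x)   # discretized state update
        ys.append(C * h)                              # per-layer readout
    return np.stack(ys)

rng = np.random.default_rng(0)
L, d = 4, 8                                           # 4 GNN layers, 8-dim features
xs = rng.normal(size=(L, d))
out = selective_scan(xs, 0.1 * rng.normal(size=(d, d)),
                     A=-np.ones(d), B=np.ones(d), C=np.ones(d))
print(out.shape)  # (4, 8)
```

Because `delta` is a function of the current input, two nodes with different feature trajectories retain different amounts of earlier-layer signal, which is the mechanism the abstract credits with keeping deep-layer representations node-specific.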
Problem

Research questions and friction points this paper is trying to address.

Addresses over-smoothing in deep Graph Neural Networks through selective modeling
Captures node-specific representation dynamics across network layers
Incorporates global context while maintaining local neighborhood aggregation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates Mamba into GNNs for over-smoothing mitigation
Uses selective state space modeling for node-specific dynamics
Combines local neighborhood aggregation with global context awareness
Xingbo He — School of Artificial Intelligence, Jilin University
Yili Wang — Jilin University
Yiwei Dai — School of Artificial Intelligence, Jilin University
Xin Wang — School of Artificial Intelligence, Jilin University

Graph Neural Networks