Mitigating Over-Squashing in Graph Neural Networks by Spectrum-Preserving Sparsification

📅 2025-06-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Graph neural networks (GNNs) suffer from structural bottlenecks that impede long-range information propagation, a limitation commonly termed *over-squashing*. Existing graph rewiring methods often compromise original graph properties or introduce extra edges, exacerbating over-smoothing and increasing computational overhead. To address this, we propose a *spectrum-preserving graph sparsification and rewiring framework*. First, we explicitly incorporate Laplacian spectral similarity constraints into the sparse rewiring objective, enhancing global connectivity without increasing the edge count. Second, we design an optimization algorithm grounded in spectral perturbation analysis and integrate a message-passing topology enhancement mechanism. Evaluated on multiple benchmark datasets, our method achieves significantly higher classification accuracy than state-of-the-art approaches. Moreover, it improves Laplacian spectrum preservation by over 40% and reduces the average edge count by 30%, demonstrating a superior trade-off among expressivity, efficiency, and structural fidelity.
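The summary does not reproduce the paper's exact spectral-similarity objective. As a rough illustration of what "Laplacian spectrum preservation" can mean, the sketch below compares the sorted eigenvalues of the combinatorial Laplacian before and after an edge modification; `spectrum_distance` is a hypothetical metric for this page, not the authors' measure.

```python
import numpy as np

def laplacian_spectrum(adj):
    """Eigenvalues of the combinatorial Laplacian L = D - A, sorted ascending."""
    deg = np.diag(adj.sum(axis=1))
    return np.sort(np.linalg.eigvalsh(deg - adj))

def spectrum_distance(adj_a, adj_b):
    """Relative L2 distance between two Laplacian spectra (hypothetical metric)."""
    sa, sb = laplacian_spectrum(adj_a), laplacian_spectrum(adj_b)
    return np.linalg.norm(sa - sb) / (np.linalg.norm(sa) + 1e-12)

# Toy example: a 4-cycle vs. the same graph with edge (0, 3) removed (a path).
cycle = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
path = cycle.copy()
path[0, 3] = path[3, 0] = 0.0
print(round(spectrum_distance(cycle, path), 3))  # → 0.312
```

A rewiring objective in this spirit would penalize this distance while rewarding improved connectivity, so that sparsification does not drift far from the original spectrum.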

📝 Abstract
The message-passing paradigm of Graph Neural Networks often struggles to exchange information between distant nodes, typically due to structural bottlenecks in certain graph regions, a limitation known as *over-squashing*. To reduce such bottlenecks, *graph rewiring*, which modifies graph topology, has been widely used. However, existing graph rewiring techniques often overlook the need to preserve critical properties of the original graph, e.g., *spectral properties*. Moreover, many approaches rely on increasing the edge count to improve connectivity, which introduces significant computational overhead and exacerbates the risk of over-smoothing. In this paper, we propose a novel graph rewiring method that leverages *spectrum-preserving* graph *sparsification* to mitigate over-squashing. Our method generates graphs with enhanced connectivity while maintaining sparsity and largely preserving the original graph spectrum, effectively balancing structural bottleneck reduction with graph property preservation. Experimental results validate the effectiveness of our approach, demonstrating its superiority over strong baselines in classification accuracy and retention of the Laplacian spectrum.
Problem

Research questions and friction points this paper is trying to address.

Addresses over-squashing in Graph Neural Networks
Preserves spectral properties during graph rewiring
Balances connectivity and sparsity without over-smoothing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectrum-preserving graph sparsification for rewiring
Enhances connectivity while maintaining graph sparsity
Balances bottleneck reduction and spectral property preservation
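The page does not spell out the sparsification algorithm itself. A classic spectrum-preserving sparsifier in the same spirit is effective-resistance sampling (Spielman–Srivastava), sketched below as a point of reference; the paper's actual optimization, built on spectral perturbation analysis, may differ substantially.

```python
import numpy as np

def effective_resistances(adj):
    """Effective resistance of each edge, via the Laplacian pseudoinverse."""
    n = adj.shape[0]
    lap = np.diag(adj.sum(axis=1)) - adj
    lap_pinv = np.linalg.pinv(lap)
    res = {}
    for i in range(n):
        for j in range(i + 1, n):
            if adj[i, j] > 0:
                e = np.zeros(n)
                e[i], e[j] = 1.0, -1.0
                res[(i, j)] = float(e @ lap_pinv @ e)
    return res

def sparsify(adj, num_samples, seed=0):
    """Sample edges with probability proportional to w_e * R_e and reweight
    each sample so the sparsifier's Laplacian is unbiased in expectation."""
    rng = np.random.default_rng(seed)
    res = effective_resistances(adj)
    edges = list(res)
    probs = np.array([adj[e] * res[e] for e in edges])
    probs /= probs.sum()
    sparse = np.zeros_like(adj)
    for idx in rng.choice(len(edges), size=num_samples, p=probs):
        i, j = edges[idx]
        w = adj[i, j] / (num_samples * probs[idx])
        sparse[i, j] += w
        sparse[j, i] += w
    return sparse
```

Because each sampled edge is reweighted by the inverse of its sampling probability, the expected Laplacian of the sparsifier equals the original Laplacian, which is what keeps the spectrum approximately intact with far fewer edges.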