🤖 AI Summary
In graph learning, noisy graph structures and node features degrade the performance of Graph Neural Networks (GNNs) for node classification. To address this, we propose the first unified framework that jointly performs graph rewiring and feature denoising. Our method leverages spectral resonance optimization—specifically, aligning the principal eigensubspace of the graph Laplacian with that of the denoised feature matrix—to achieve synergistic optimization of both components. Grounded in spectral graph theory, we formulate a non-convex joint objective and design an efficient heuristic algorithm applicable to both homophilic and heterophilic graphs. Extensive experiments on synthetic and real-world benchmarks demonstrate that our approach consistently outperforms existing graph rewiring methods, yielding average improvements of 3.2–7.8 percentage points in GNN node classification accuracy. These results validate the effectiveness and generalizability of joint structural and feature refinement.
📝 Abstract
In graph learning the graph and the node features both contain noisy information about the node labels. In this paper we propose joint denoising and rewiring (JDR)--an algorithm to jointly rewire the graph and denoise the features, which improves the performance of downstream node classification graph neural nets (GNNs). JDR improves the alignment between the leading eigenspaces of graph and feature matrices. To approximately solve the associated non-convex optimization problem we propose a heuristic that efficiently handles real-world graph datasets with multiple classes and different levels of homophily or heterophily. We theoretically justify JDR in a stylized setting and verify the effectiveness of our approach through extensive experiments on synthetic and real-world graph datasets. The results show that JDR consistently outperforms existing rewiring methods on node classification using GNNs as downstream models.