🤖 AI Summary
Noisy and anomalous edge weights in weighted graphs distort fine-grained node relationships and degrade classification performance. To address this, the paper proposes a joint optimization framework. First, it reconstructs GAT attention coefficients by explicitly integrating both node features and edge weights. Second, it introduces learnable graph structure optimization to jointly infer edge weights and a sparse graph topology. Third, it designs an attention sparsification mechanism and a denoising-adapted variant of the InfoNCE loss. This work is the first to achieve end-to-end joint optimization of edge weight learning and graph structure learning. Extensive experiments on multiple benchmark datasets show an average 17.8% improvement in Micro-F1 over state-of-the-art baselines, confirming substantial gains in robustness and discriminative capability.
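The first component, attention coefficients that mix node features with edge weights, can be illustrated with a small sketch. The paper's exact formulation is not reproduced here; the function below is a hypothetical GAT-style score in which the feature-based attention logit for each edge is modulated by that edge's weight before the per-node softmax (the multiplicative combination is an assumption for illustration).

```python
import numpy as np

def edge_weight_attention(h, W, a, edge_index, edge_weight):
    """Hypothetical sketch: GAT-style attention whose per-edge logit is
    scaled by the (possibly denoised) edge weight. Not the paper's exact
    EWGSL formula -- an illustration of combining features and weights."""
    z = h @ W  # linearly transformed node features
    scores = {}
    for (i, j), w in zip(edge_index, edge_weight):
        # feature-based score on the concatenated pair, LeakyReLU(0.2),
        # then modulated by the edge weight w (assumed combination)
        e_ij = np.concatenate([z[i], z[j]]) @ a
        scores[(i, j)] = w * np.maximum(0.2 * e_ij, e_ij)
    # softmax-normalize the scores over each source node's neighborhood
    alpha = {}
    for src in {i for (i, _) in scores}:
        nbrs = [(j, s) for (i, j), s in scores.items() if i == src]
        m = max(s for _, s in nbrs)                 # numerical stability
        exps = {j: np.exp(s - m) for j, s in nbrs}
        total = sum(exps.values())
        for j, e in exps.items():
            alpha[(src, j)] = e / total
    return alpha
```

As in standard GAT, the coefficients for each node's neighborhood sum to one; a low or anomalous edge weight suppresses that edge's logit before normalization, which is the intuition the summary describes.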
📝 Abstract
Node classification in graphs aims to predict the categories of unlabeled nodes using a small set of labeled nodes. However, weighted graphs often contain noisy edges and anomalous edge weights, which distort fine-grained relationships between nodes and hinder accurate classification. We propose the Edge Weight-aware Graph Structure Learning (EWGSL) method, which combines edge weight learning and graph structure learning to address these issues. EWGSL improves node classification by redefining the attention coefficients in graph attention networks to incorporate both node features and edge weights. It also applies graph structure learning to sparsify the attention coefficients and uses a modified InfoNCE loss that adapts to the denoised edge weights. Extensive experiments show that EWGSL achieves an average Micro-F1 improvement of 17.8% over the best baseline.
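The modified InfoNCE loss mentioned above can be sketched as follows. The paper's exact adaptation is not given here; this hypothetical variant scales the positive pair's similarity by the denoised edge weight before the usual contrastive normalization, so that confidently denoised edges pull their endpoints' embeddings together more strongly (the scaling choice and the temperature `tau` are assumptions for illustration).

```python
import numpy as np

def weighted_info_nce(z_anchor, z_pos, z_negs, denoised_weight, tau=0.5):
    """Hypothetical denoising-adapted InfoNCE: the positive similarity is
    scaled by the denoised edge weight (an assumed adaptation, not the
    paper's exact loss). Returns a scalar loss for one anchor."""
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    # positive term, modulated by the denoised weight of the anchor-pos edge
    pos = np.exp(denoised_weight * cos(z_anchor, z_pos) / tau)
    # negative terms: other nodes contrasted against the anchor
    neg = sum(np.exp(cos(z_anchor, zn) / tau) for zn in z_negs)
    return -np.log(pos / (pos + neg))
```

Under this sketch, raising the denoised weight of a well-aligned positive pair lowers the loss, which matches the abstract's claim that the loss adapts to the denoised graph weights.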