Simple Network Graph Comparative Learning

📅 2026-01-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses two key challenges in graph contrastive learning for node classification: view distortion caused by data augmentation and heavy reliance on large numbers of negative samples. To overcome these limitations, the authors propose SNGCL, a novel approach that eliminates the need for traditional negative samples. Instead, SNGCL generates globally and locally smoothed feature matrices by stacking multiple layers of Laplacian smoothing filters, which are fed into the online and target branches of a Siamese network, respectively. An improved triplet reassembly loss is further designed to enhance intra-class compactness and inter-class separability. Extensive experiments demonstrate that SNGCL achieves competitive or state-of-the-art performance across multiple node classification benchmarks, validating its effectiveness and robustness.
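The core preprocessing idea above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: it assumes the common symmetrically normalized adjacency with self-loops as the smoothing operator, and treats "local" vs. "global" smoothing as simply stacking fewer or more filter layers. The function and variable names are illustrative.

```python
import numpy as np

def laplacian_smoothing(adj, features, k):
    """Apply k stacked Laplacian smoothing layers to node features.

    Assumes the filter S = D^{-1/2} (A + I) D^{-1/2}, a common choice;
    the paper's exact filter form may differ.
    """
    a_hat = adj + np.eye(adj.shape[0])              # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    s = d_inv_sqrt @ a_hat @ d_inv_sqrt             # normalized smoothing operator
    x = features
    for _ in range(k):                              # stack k smoothing layers
        x = s @ x
    return x

# Toy 4-node path graph with random 8-dimensional features
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = rng.random((4, 8))

x_local = laplacian_smoothing(adj, x, k=2)    # fewer layers: locally smoothed view
x_global = laplacian_smoothing(adj, x, k=10)  # more layers: globally smoothed view
```

In the SNGCL framing, the two resulting matrices would feed the online and target branches of the Siamese network; stacking more layers averages features over a wider neighborhood, which is why deeper stacks yield a more "global" view.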

📝 Abstract
The effectiveness of contrastive learning methods has been widely recognized in the field of graph learning, especially in contexts where graph data often lack labels or are difficult to label. However, applying these methods to node classification tasks still faces several challenges. First, existing data augmentation techniques may produce new views that differ significantly from the original, which can weaken the correlation between views and reduce the efficiency of model training. Second, the vast majority of existing graph contrastive learning algorithms rely on large numbers of negative samples. To address these challenges, this study proposes a novel contrastive learning method for node classification called Simple Network Graph Comparative Learning (SNGCL). Specifically, SNGCL stacks multiple layers of Laplacian smoothing filters as a preprocessing step to obtain globally and locally smoothed feature matrices, which are passed into the target and online branches of a Siamese network, respectively; it then employs an improved triplet reassembly loss function to reduce intra-class distances and enlarge inter-class distances. We compare SNGCL with state-of-the-art models on node classification tasks, and the experimental results show that SNGCL is strongly competitive on most of them.
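The loss described above pulls same-class embeddings together and pushes different-class embeddings apart. As a point of reference, here is a minimal numpy sketch of the standard triplet margin loss; the paper's "reassembly" variant concerns how triplets are constructed, which is not reproduced here, and the `margin` value is an illustrative assumption.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet margin loss over a batch of embeddings.

    Encourages d(anchor, positive) + margin < d(anchor, negative),
    i.e. intra-class compactness and inter-class separation.
    """
    d_pos = np.linalg.norm(anchor - positive, axis=1)  # intra-class distance
    d_neg = np.linalg.norm(anchor - negative, axis=1)  # inter-class distance
    return np.maximum(d_pos - d_neg + margin, 0.0).mean()

# When the negative is already far from the anchor, the loss vanishes
a = np.zeros((1, 2))
p = np.zeros((1, 2))
n = np.array([[5.0, 0.0]])
print(triplet_loss(a, p, n))  # → 0.0
```

Minimizing this objective drives embeddings of the same class toward each other while keeping embeddings of different classes at least `margin` apart, matching the intra-class/inter-class goal stated in the abstract.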
Problem

Research questions and friction points this paper is trying to address.

contrastive learning
node classification
data augmentation
negative samples
graph learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

contrastive learning
graph neural networks
Laplacian smoothing
siamese network
node classification