Diffusion on Graph: Augmentation of Graph Structure for Node Classification

📅 2025-03-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing graph diffusion models focus on whole-graph generation and remain unexplored for structural enhancement in node-level tasks such as node classification and graph contrastive learning. This paper proposes DoG (Diffusion on Graph), the first framework to leverage graph diffusion modeling for *synthesizing node and edge structures on a given graph*, thereby enhancing node-level representation learning. Methodologically: (1) a Bi-Level Neighbor Map Decoder (BLND) explicitly captures local neighborhood dependencies to improve generation efficiency; (2) a low-rank regularization method applied to GNN training on the augmented graph mitigates the structural noise introduced by the synthetic structures. Extensive experiments on multiple benchmark datasets demonstrate that DoG significantly improves performance in semi-supervised node classification and graph contrastive learning. The source code is publicly available.

📝 Abstract
Graph diffusion models have recently been proposed to synthesize entire graphs, such as molecule graphs. Although existing methods have shown great performance in generating entire graphs for graph-level learning tasks, no graph diffusion models have been developed to generate synthetic graph structures, that is, synthetic nodes and associated edges within a given graph, for node-level learning tasks. Inspired by the research in the computer vision literature using synthetic data for enhanced performance, we propose Diffusion on Graph (DoG), which generates synthetic graph structures to boost the performance of GNNs. The synthetic graph structures generated by DoG are combined with the original graph to form an augmented graph for the training of node-level learning tasks, such as node classification and graph contrastive learning (GCL). To improve the efficiency of the generation process, a Bi-Level Neighbor Map Decoder (BLND) is introduced in DoG. To mitigate the adverse effect of the noise introduced by the synthetic graph structures, a low-rank regularization method is proposed for the training of graph neural networks (GNNs) on the augmented graphs. Extensive experiments on various graph datasets for semi-supervised node classification and graph contrastive learning have been conducted to demonstrate the effectiveness of DoG with low-rank regularization. The code of DoG is available at https://github.com/Statistical-Deep-Learning/DoG.
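The augmentation idea in the abstract — merging diffusion-generated synthetic nodes and edges into the original graph, then constraining the result to be low-rank to suppress noise — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, the synthetic structures are taken as given (the paper generates them with a diffusion model and BLND), and the nuclear norm is used here as a standard convex surrogate for a low-rank penalty; DoG's exact regularization may differ.

```python
import numpy as np

def augment_graph(A, A_syn, X, X_syn):
    """Merge m synthetic nodes into an n-node graph.

    A:     (n, n) original adjacency matrix
    A_syn: (m, n) edges from synthetic nodes to original nodes
    X:     (n, d) original node features
    X_syn: (m, d) synthetic node features
    Returns the augmented adjacency (n+m, n+m) and features (n+m, d).
    """
    n, m = A.shape[0], A_syn.shape[0]
    A_aug = np.zeros((n + m, n + m))
    A_aug[:n, :n] = A            # keep original edges
    A_aug[n:, :n] = A_syn        # synthetic -> original edges
    A_aug[:n, n:] = A_syn.T      # symmetrize for an undirected graph
    X_aug = np.vstack([X, X_syn])
    return A_aug, X_aug

def low_rank_penalty(H):
    """Nuclear norm (sum of singular values) of node embeddings H,
    a common convex surrogate for a low-rank constraint."""
    return np.linalg.svd(H, compute_uv=False).sum()
```

During GNN training on the augmented graph, `low_rank_penalty` would be added (with a weight) to the task loss, biasing the learned embeddings toward low rank so that spurious synthetic edges contribute less.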
Problem

Research questions and friction points this paper is trying to address.

Existing graph diffusion models synthesize entire graphs; none generate synthetic structures within a given graph for node-level tasks.
GNN performance on node-level tasks is limited by the fixed original graph structure.
Synthetic graph structures introduce noise that can harm GNN training.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generates synthetic graph structures for GNNs
Introduces Bi-Level Neighbor Map Decoder
Uses low-rank regularization for noise reduction