Non-Homophilic Graph Pre-Training and Prompt Learning

📅 2024-08-22
🏛️ arXiv.org
📈 Citations: 2
Influential: 0
📄 PDF
🤖 AI Summary
Existing graph neural network (GNN) pre-training and prompt learning methods overlook the dynamic evolution of node similarity/dissimilarity, particularly under non-homophilous graph structures—where homophilous and heterophilous patterns coexist—leading to suboptimal performance. Method: We propose ProNoG, the first pre-training and prompt learning framework tailored for non-homophilous graphs. It systematically analyzes how non-homophily affects pre-training objectives, introduces a conditional node-aware prompting mechanism to capture node-level heterophilous patterns, and adopts a dual-path design integrating theoretically grounded interpretability analysis with data-driven optimization. Contribution/Results: Evaluated on 10 public benchmarks, ProNoG consistently outperforms state-of-the-art methods, achieving an average accuracy gain of 7.2% on strongly heterophilous graphs. Ablation studies validate both the effectiveness and generalizability of node-level prompts, establishing ProNoG as a principled and empirically robust solution for non-homophilous graph representation learning.

Technology Category

Application Category

📝 Abstract
Graphs are ubiquitous for modeling complex relationships between objects across various fields. Graph neural networks (GNNs) have become a mainstream technique for graph-based applications, but their performance heavily relies on abundant labeled data. To reduce labeling requirement, pre-training and prompt learning has become a popular alternative. However, most existing prompt methods do not differentiate homophilic and heterophilic characteristics of real-world graphs. In particular, many real-world graphs are non-homophilic, not strictly or uniformly homophilic with mixing homophilic and heterophilic patterns, exhibiting varying non-homophilic characteristics across graphs and nodes. In this paper, we propose ProNoG, a novel pre-training and prompt learning framework for such non-homophilic graphs. First, we analyze existing graph pre-training methods, providing theoretical insights into the choice of pre-training tasks. Second, recognizing that each node exhibits unique non-homophilic characteristics, we propose a conditional network to characterize the node-specific patterns in downstream tasks. Finally, we thoroughly evaluate and analyze ProNoG through extensive experiments on ten public datasets.
Problem

Research questions and friction points this paper is trying to address.

Graph Neural Networks
Pre-training
Prompt Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pretraining
Graph Neural Networks
Node-specific Dynamics
X
Xing-Xing Yu
Singapore Management University, Singapore
J
Jie Zhang
National University of Singapore, Singapore
Y
Yuan Fang
Singapore Management University, Singapore
Renhe Jiang
Renhe Jiang
The University of Tokyo
AISpatio-temporal Data MiningHuman MobilityGraph LearningTime Series Forecasting