🤖 AI Summary
This work addresses zero-shot node classification in text-attributed graphs under a fully unsupervised setting. The authors propose Zero-shot Prompt Tuning (ZPT), a framework that introduces, for the first time, a Universal Bimodal Conditional Generator (UBCG) to jointly model graph structure and node text. Using only class names, UBCG generates class-aware synthetic nodes, which enable zero-shot classification via continuous prompt tuning. By combining graph-language pretraining, bimodal generation, and prompt learning, ZPT achieves substantial gains over state-of-the-art methods across multiple benchmark datasets, and ablation studies confirm the effectiveness and necessity of the proposed bimodal generation mechanism.
📝 Abstract
Node classification is a fundamental problem in information retrieval with many real-world applications, such as detecting communities in social networks, grouping articles published online, and categorizing products in e-commerce. Zero-shot node classification in text-attributed graphs (TAGs) is particularly challenging due to the absence of labeled data. In this paper, we propose a novel Zero-shot Prompt Tuning (ZPT) framework that addresses this problem with a Universal Bimodal Conditional Generator (UBCG). Our approach first pre-trains a graph-language model to capture both the graph structure and the textual description associated with each node. A conditional generative model is then trained to learn the joint distribution of nodes in the graph and text modalities, enabling the generation of synthetic samples for each class from the class name alone. These synthetic node and text embeddings are used to perform continuous prompt tuning, yielding effective node classification in a zero-shot setting. Extensive experiments on multiple benchmark datasets demonstrate that our framework outperforms state-of-the-art baselines, and ablation studies validate the contribution of the bimodal generator. The code is available at: https://github.com/Sethup123/ZPT.
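The pipeline the abstract describes (encode nodes, generate class-conditioned synthetic embeddings from class names, classify in a zero-shot fashion) can be illustrated with a toy sketch. To be clear, this is not the paper's implementation: the bag-of-words `embed`, the Gaussian `generate`, and the mean-prototype classifier below are simplified stand-ins for, respectively, the pretrained graph-language encoder, the UBCG, and continuous prompt tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny fixed vocabulary; the real framework uses a pretrained
# graph-language encoder over node text and graph structure.
VOCAB = ["sports", "match", "team", "politics", "election", "vote"]

def embed(text):
    """Toy encoder: L2-normalized bag-of-words over VOCAB."""
    words = set(text.lower().split())
    v = np.array([1.0 if w in words else 0.0 for w in VOCAB])
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def generate(class_name, n=64, noise=0.05):
    """Class-conditioned 'generator' (crude UBCG stand-in): synthetic
    node embeddings sampled around the class-name embedding."""
    c = embed(class_name)
    return c + noise * rng.standard_normal((n, c.shape[0]))

CLASSES = ["sports", "politics"]

# 'Prompt tuning' stand-in: fit one prototype per class from the
# synthetic samples (here, simply their mean embedding).
PROTOTYPES = np.stack([generate(c).mean(axis=0) for c in CLASSES])

def classify(node_text):
    """Zero-shot classification: nearest prototype by cosine similarity."""
    v = embed(node_text)
    sims = PROTOTYPES @ v / np.linalg.norm(PROTOTYPES, axis=1)
    return CLASSES[int(np.argmax(sims))]
```

With this sketch, `classify("sports match today")` returns `"sports"`: no labeled nodes are ever seen, only the class names that condition the generator.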