🤖 AI Summary
To address the performance degradation that heterogeneous graph neural networks suffer when pre-training objectives are misaligned with downstream tasks, this paper proposes HGMP, a graph-level multi-task prompt learning framework tailored to heterogeneous graphs. The method unifies task formats, introduces a graph-level contrastive pre-training mechanism, and incorporates a learnable heterogeneous feature prompting module, collectively bridging the task gap and strengthening semantic representation. By integrating prompt learning, contrastive learning, and multi-task learning, the framework jointly models heterogeneous structural and semantic information. Extensive experiments on multiple public benchmarks show that the approach significantly outperforms state-of-the-art baselines across diverse downstream tasks, including node classification, link prediction, and graph classification, exhibiting strong generalization and adaptability in the heterogeneous graph domain.
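The summary mentions unifying all downstream tasks into a single graph-level format, but the paper's exact construction is not reproduced here. A common way to realize such a reformulation, sketched below as an assumption rather than the authors' method, is to induce a k-hop subgraph around each task's target nodes (one seed node for node classification, an endpoint pair for link prediction), so that every task reduces to classifying a small graph:

```python
from collections import deque

def k_hop_subgraph(adj, seeds, k):
    """Collect the node set of the k-hop subgraph induced by `seeds`.

    adj:   dict mapping node -> list of neighbours (heterogeneity omitted
           for brevity; a real implementation would track node/edge types)
    seeds: iterable of seed nodes (one node for node classification,
           an endpoint pair for link prediction)
    k:     number of hops to expand outward from the seeds
    """
    visited = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue  # do not expand past the k-hop boundary
        for nb in adj.get(node, []):
            if nb not in visited:
                visited.add(nb)
                frontier.append((nb, depth + 1))
    return visited
```

For example, on a path graph `0-1-2-3`, `k_hop_subgraph(adj, [0], 1)` yields `{0, 1}`, while seeding with both endpoints of a candidate link, `k_hop_subgraph(adj, [0, 3], 1)`, yields the context of that pair; the resulting subgraphs can all be fed to one graph-level classifier. The function name and interface are illustrative, not taken from the paper.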
📝 Abstract
Pre-training and fine-tuning methods have gained widespread attention in the field of heterogeneous graph neural networks because they can exploit large amounts of unlabeled data during the pre-training phase, allowing the model to learn rich structural features. However, these methods suffer from a mismatch between the pre-trained model and downstream tasks, which leads to suboptimal performance in certain application scenarios. Prompt learning has emerged as a new direction for heterogeneous graph tasks because it flexibly adapts task representations to resolve this objective inconsistency. Building on this idea, this paper proposes HGMP, a novel multi-task prompt framework for the heterogeneous graph domain. First, to bridge the gap between the pre-trained model and downstream tasks, we reformulate all downstream tasks into a unified graph-level format. Second, we address a limitation of existing graph prompt learning methods, which struggle to integrate contrastive pre-training strategies in the heterogeneous graph domain: we design a graph-level contrastive pre-training strategy that better leverages heterogeneous information and improves performance in multi-task scenarios. Finally, we introduce heterogeneous feature prompts, which refine the representation of input graph features and further enhance model performance. Experimental results on public datasets show that the proposed method adapts well to diverse tasks and significantly outperforms baseline methods.
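The abstract names two further components, graph-level contrastive pre-training and learnable heterogeneous feature prompts, without implementation details. The sketch below is a hypothetical minimal rendering of both ideas under common conventions: per-node-type prompt vectors added element-wise to input features, and an InfoNCE-style loss between two views of the same batch of graphs. The function names, shapes, and the specific loss form are illustrative assumptions, not the paper's API:

```python
import numpy as np

def apply_feature_prompts(features, node_types, prompts):
    """Add a per-type prompt vector to each node's input features.

    features:   (N, d) node feature matrix
    node_types: length-N array of integer type ids (e.g. 0=author, 1=paper)
    prompts:    (T, d) one prompt vector per node type; in practice these
                are the trainable parameters tuned for each downstream task
    """
    return features + prompts[node_types]

def graph_level_contrastive_loss(z1, z2, tau=0.5):
    """InfoNCE-style loss between two augmented views of the same graphs.

    z1, z2: (B, d) graph-level embeddings; row i of z1 and row i of z2
            form the positive pair, all other rows act as negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (B, B) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives lie on the diagonal
```

In this sketch the prompts are additive and type-specific, so only a (T, d) parameter table is tuned per task while the pre-trained encoder stays frozen; the contrastive loss is minimized when each graph's two views are closer to each other than to any other graph in the batch.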