Prompt-Driven Continual Graph Learning

📅 2025-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses catastrophic forgetting in continual graph learning (CGL) on dynamic graphs. The authors propose a prompt-driven paradigm that requires neither model parameter updates nor historical data replay. The core idea is to freeze the graph neural network weights and learn only lightweight, hierarchically structured prompts. Specifically, they design a dual-level prompting mechanism, operating jointly on node features and graph topology, and introduce a node-wise personalized prompt generator to enable task-adaptive learning and knowledge consolidation. The method incurs only constant memory overhead, inherently avoiding the privacy risks and scalability bottlenecks associated with data storage or model expansion. Evaluated on four standard benchmarks, the approach achieves average accuracy improvements of 3.2%–7.8% over state-of-the-art methods, reduces memory consumption by over 92%, and, for the first time, enables long-term continual learning on ultra-large-scale graphs.

📝 Abstract
Continual Graph Learning (CGL), which aims to accommodate new tasks over evolving graph data without forgetting prior knowledge, is garnering significant research interest. Mainstream solutions adopt the memory replay-based idea, i.e., caching representative data from earlier tasks for retraining the graph model. However, this strategy struggles with scalability issues for constantly evolving graphs and raises concerns regarding data privacy. Inspired by recent advancements in the prompt-based learning paradigm, this paper introduces a novel prompt-driven continual graph learning (PROMPTCGL) framework, which learns a separate prompt for each incoming task and keeps the underlying graph neural network model fixed. In this way, PROMPTCGL naturally avoids catastrophic forgetting of knowledge from previous tasks. More specifically, we propose hierarchical prompting to instruct the model at both the feature and topology levels to fully address the variability of task graphs in dynamic continual learning. Additionally, we develop a personalized prompt generator to produce tailored prompts for each graph node while minimizing the number of prompts needed, leading to constant memory consumption regardless of the graph scale. Extensive experiments on four benchmarks show that PROMPTCGL achieves superior performance against existing CGL approaches while significantly reducing memory consumption. Our code is available at https://github.com/QiWang98/PromptCGL.
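The central mechanism described above, freezing the GNN weights and learning only a small per-task prompt applied to node features before message passing, can be illustrated with a minimal, gradient-free numpy sketch. All names here (`gnn_layer`, the toy adjacency `A`, the single feature-level `prompt` vector) are illustrative assumptions, not the paper's actual implementation, which additionally includes topology-level prompts and a personalized prompt generator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, adjacency with self-loops (assumed example data)
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
A_hat = A / A.sum(axis=1, keepdims=True)   # row-normalized propagation matrix

d = 8
X = rng.normal(size=(4, d))                # node features
W = rng.normal(size=(d, d)) * 0.1          # frozen GNN weights: never updated

def gnn_layer(X, prompt):
    # Feature-level prompting: the prompt is added to every node's input
    # before message passing; only `prompt` would be trained per task.
    return np.maximum(A_hat @ (X + prompt) @ W, 0.0)  # ReLU activation

prompt = np.zeros(d)                       # per-task learnable parameters
out_before = gnn_layer(X, prompt)

# Stand-in for training: a nonzero prompt steers the frozen model's outputs
prompt = rng.normal(size=d) * 0.5
out_after = gnn_layer(X, prompt)
```

Because each task only adds one small prompt (here, `d` scalars) while `W` stays fixed, memory cost is constant in the graph size, which is the scalability property the abstract emphasizes.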
Problem

Research questions and friction points this paper is trying to address.

Addresses catastrophic forgetting in graph learning
Enhances scalability for evolving graph data
Reduces memory consumption in continual learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prompt-driven learning framework
Hierarchical prompting strategy
Personalized prompt generator
Qi Wang
Beijing Institute of Technology, Zhuhai, China
Tianfei Zhou
Beijing Institute of Technology | ETH Zurich
Artificial Intelligence · Medical AI · Computer Vision
Ye Yuan
School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China
Rui Mao
Nanyang Technological University
Computational Linguistics · Cognitive Computing · Metaphor · Quantitative Finance · Neurosymbolic AI