Towards Effective Open-set Graph Class-incremental Learning

📅 2025-07-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses key limitations of graph class-incremental learning (GCIL) in open-set scenarios: the restrictive closed-set assumption fails to handle unknown classes emerging during inference, and catastrophic forgetting of old classes conflicts with reliable unknown-class identification. We propose the first open-set GCIL framework. Methodologically, we design a prototype-conditional variational autoencoder to generate high-fidelity pseudo-samples for replay-free knowledge preservation; introduce a prototype-based hyperspherical classification loss to enforce intra-class compactness and inter-class separability; and establish an explicit open-set recognition mechanism via prototype-aware rejection regions. Evaluated on five benchmark graph datasets, our approach significantly outperforms existing GCIL and open-set GNN methods, achieving superior trade-offs between knowledge retention and unknown-class detection. This advances the capability of graph neural networks for continual learning and anomaly detection in dynamically evolving environments.

📝 Abstract
Graph class-incremental learning (GCIL) allows graph neural networks (GNNs) to adapt to evolving graph analytical tasks by incrementally learning new class knowledge while retaining knowledge of old classes. Existing GCIL methods primarily focus on a closed-set assumption, where all test samples are presumed to belong to previously known classes. Such an assumption restricts their applicability in real-world scenarios, where unknown classes, absent during training, naturally emerge during inference. In this paper, we explore a more challenging open-set graph class-incremental learning scenario with two intertwined challenges: catastrophic forgetting of old classes, which impairs the detection of unknown classes, and inadequate open-set recognition, which destabilizes the retention of learned knowledge. To address the above problems, a novel OGCIL framework is proposed, which utilizes pseudo-sample embedding generation to effectively mitigate catastrophic forgetting and enable robust detection of unknown classes. To be specific, a prototypical conditional variational autoencoder is designed to synthesize node embeddings for old classes, enabling knowledge replay without storing raw graph data. To handle unknown classes, we employ a mixing-based strategy to generate out-of-distribution (OOD) samples from pseudo in-distribution and current node embeddings. A novel prototypical hypersphere classification loss is further proposed, which anchors in-distribution embeddings to their respective class prototypes, while repelling OOD embeddings away. Instead of assigning all unknown samples into one cluster, our proposed objective function explicitly models them as outliers through prototype-aware rejection regions, ensuring a robust open-set recognition.  Extensive experiments on five benchmarks demonstrate the effectiveness of OGCIL over existing GCIL and open-set GNN methods.
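The prototype-aware rejection regions described in the abstract can be made concrete with a small sketch. The paper's exact loss is not reproduced here; the numpy function below illustrates one plausible form, where in-distribution embeddings are pulled inside a per-class hypersphere around their prototype and OOD embeddings (marked with label -1, an assumed convention) are pushed beyond a margin around every prototype.

```python
import numpy as np

def prototypical_hypersphere_loss(embeddings, labels, prototypes,
                                  radius=1.0, margin=0.5):
    """A plausible prototype-anchored hypersphere loss (illustrative only).

    - In-distribution samples (label >= 0) are penalized by how far they
      sit outside the hypersphere of the given radius around their
      class prototype.
    - OOD samples (label == -1) are penalized for falling inside the
      rejection region (radius + margin) of *any* prototype.
    """
    loss = 0.0
    for z, y in zip(embeddings, labels):
        dists = np.linalg.norm(prototypes - z, axis=1)
        if y >= 0:
            # pull the embedding inside its class hypersphere
            loss += max(dists[y] - radius, 0.0)
        else:
            # repel the OOD embedding from every prototype's rejection region
            loss += np.sum(np.maximum(radius + margin - dists, 0.0))
    return loss / len(embeddings)
```

Note how an OOD point is not forced toward a single "unknown" cluster: it only has to stay outside each prototype's rejection region, matching the outlier-modeling idea described above.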
Problem

Research questions and friction points this paper is trying to address.

Address catastrophic forgetting in open-set graph class-incremental learning
Improve open-set recognition for unknown class detection
Enable knowledge replay without storing raw graph data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pseudo-sample embedding generation mitigates catastrophic forgetting
Mixing-based strategy generates out-of-distribution samples
Prototypical hypersphere loss anchors and repels embeddings
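The mixing-based OOD generation mentioned above can be sketched as a mixup-style interpolation between pairs of in-distribution embeddings; the paper's exact mixing scheme may differ, and the Beta-distributed mixing coefficient here is an assumption borrowed from standard mixup practice.

```python
import numpy as np

def mix_ood_embeddings(emb_a, emb_b, alpha=0.4, rng=None):
    """Generate pseudo-OOD embeddings by convexly mixing paired
    in-distribution embeddings (illustrative mixup-style heuristic).

    emb_a, emb_b: arrays of shape (n, d), e.g. pseudo-replayed old-class
    embeddings paired with current-task node embeddings.
    """
    rng = np.random.default_rng() if rng is None else rng
    # one mixing coefficient per pair, broadcast across embedding dims
    lam = rng.beta(alpha, alpha, size=(len(emb_a), 1))
    return lam * emb_a + (1 - lam) * emb_b
```

Because the mixed points lie on segments between embeddings of different classes, they tend to fall off the class manifolds and can serve as surrogate outliers for training the rejection regions.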
🔎 Similar Papers
2024-07-27 · IEEE Transactions on Pattern Analysis and Machine Intelligence · Citations: 2
Jiazhen Chen, University of Waterloo
Zheng Ma, University of Waterloo
Sichao Fu, Huazhong University of Science and Technology
Mingbin Feng, University of Waterloo
Tony S. Wirjanto, Department of Statistics & Actuarial Science, University of Waterloo (Financial Econometrics/Time Series, Financial Mathematics, Computational Finance)
Weihua Ou, Guizhou Normal University (Computer Vision and Artificial Intelligence)