🤖 AI Summary
This work addresses key limitations of graph class-incremental learning (GCIL) in open-set scenarios: the restrictive closed-set assumption fails to handle unknown classes emerging during inference, and catastrophic forgetting of old classes conflicts with reliable unknown-class identification. We propose the first open-set GCIL framework. Methodologically, we design a prototype-conditional variational autoencoder that generates high-fidelity pseudo-samples for replay-free knowledge preservation, introduce a prototype-based hyperspherical classification loss that enforces intra-class compactness and inter-class separability, and establish an explicit open-set recognition mechanism via prototype-aware rejection regions. Evaluated on five benchmark graph datasets, our approach significantly outperforms existing GCIL and open-set GNN methods, achieving a superior trade-off between knowledge retention and unknown-class detection. This advances the capability of graph neural networks for continual learning and anomaly detection in dynamically evolving environments.
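The replay-free preservation idea above can be illustrated with a minimal sketch: a conditional decoder takes latent noise plus an old-class prototype and emits pseudo node embeddings, so no raw graph data needs to be stored. The linear `decoder_W`/`decoder_b` and the function name are hypothetical placeholders standing in for a trained CVAE decoder, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def replay_pseudo_embeddings(prototype, decoder_W, decoder_b, n=8, z_dim=4):
    """Sketch of prototype-conditional replay: sample latent noise,
    concatenate the old-class prototype as the condition, and decode
    pseudo embeddings for that class (illustrative linear decoder)."""
    z = rng.standard_normal((n, z_dim))
    cond = np.tile(prototype, (n, 1))   # condition every sample on the prototype
    return np.concatenate([z, cond], axis=1) @ decoder_W + decoder_b
```

Pseudo-embeddings drawn this way can then be mixed into each incremental task's training batch in place of stored old-class nodes.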
📝 Abstract
Graph class-incremental learning (GCIL) allows graph neural networks (GNNs) to adapt to evolving graph analytical tasks by incrementally learning new class knowledge while retaining knowledge of old classes. Existing GCIL methods primarily rely on a closed-set assumption, under which all test samples are presumed to belong to previously known classes. Such an assumption restricts their applicability in real-world scenarios, where unknown classes naturally emerge during inference yet are absent from training. In this paper, we explore a more challenging open-set graph class-incremental learning scenario with two intertwined challenges: catastrophic forgetting of old classes, which impairs the detection of unknown classes, and inadequate open-set recognition, which destabilizes the retention of learned knowledge. To address these problems, we propose a novel framework, OGCIL, which utilizes pseudo-sample embedding generation to mitigate catastrophic forgetting and enable robust detection of unknown classes. Specifically, a prototypical conditional variational autoencoder is designed to synthesize node embeddings for old classes, enabling knowledge replay without storing raw graph data. To handle unknown classes, we employ a mixing-based strategy that generates out-of-distribution (OOD) samples from pseudo in-distribution and current node embeddings. A novel prototypical hypersphere classification loss is further proposed, which anchors in-distribution embeddings to their respective class prototypes while repelling OOD embeddings away. Instead of assigning all unknown samples to one cluster, our objective function explicitly models them as outliers through prototype-aware rejection regions, ensuring robust open-set recognition. Extensive experiments on five benchmarks demonstrate the effectiveness of OGCIL over existing GCIL and open-set GNN methods.
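The mixing-based OOD synthesis and the hyperspherical objective described above can be sketched as follows. This is a minimal illustration, not the paper's exact loss: in-distribution embeddings are pulled inside a hypersphere of some `radius` around their class prototype, while OOD embeddings (here labeled `-1`) are pushed outside a `margin` around every prototype, i.e. out of each prototype-aware rejection region. All function names and hyperparameter values are assumptions for the sketch.

```python
import numpy as np

def mix_ood(emb_a, emb_b, lam=0.5):
    """Mixing-based OOD synthesis: a convex combination of embeddings
    drawn from two different classes (mixup-style heuristic)."""
    return lam * emb_a + (1.0 - lam) * emb_b

def hypersphere_loss(emb, labels, prototypes, radius=1.0, margin=2.0):
    """Illustrative prototype-based hyperspherical loss.

    ID samples (labels >= 0): hinge penalty for lying farther than
    `radius` from their own class prototype (intra-class compactness).
    OOD samples (labels == -1): hinge penalty for lying closer than
    `margin` to *any* prototype (prototype-aware rejection regions).
    """
    # Pairwise distances: (n_samples, n_prototypes)
    dists = np.linalg.norm(emb[:, None, :] - prototypes[None, :, :], axis=-1)
    loss = 0.0
    for i, y in enumerate(labels):
        if y >= 0:   # ID: stay inside the class hypersphere
            loss += max(dists[i, y] - radius, 0.0)
        else:        # OOD: stay outside every rejection region
            loss += np.maximum(margin - dists[i], 0.0).sum()
    return loss / len(labels)
```

Under this toy objective, an ID embedding sitting on its prototype incurs zero loss, while a mixed OOD embedding is penalized only if it drifts inside some prototype's margin, so unknowns are modeled as outliers rather than forced into a single extra class.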