🤖 AI Summary
To address catastrophic forgetting and the semantic shift of the non-entity type in continual named entity recognition (CNER), this paper proposes GenCNER, a generative continual learning framework that reformulates CNER as the generation of entity triplet sequences (type, start, end) and leverages a pre-trained seq2seq model for incremental learning. The authors introduce a type-specific confidence-based pseudo-labeling strategy combined with triplet-level knowledge distillation to jointly preserve learned knowledge and suppress label noise, and the approach operates without access to historical data. Extensive experiments on two benchmark datasets under multiple continual learning settings demonstrate consistent improvements over state-of-the-art methods, with the smallest performance gap relative to non-continual, fully supervised results.
📝 Abstract
Traditional named entity recognition (NER) aims to identify text mentions and classify them into pre-defined entity types. Continual Named Entity Recognition (CNER) has been introduced because entity categories keep increasing in many real-world scenarios. However, existing continual learning (CL) methods for NER suffer from catastrophic forgetting and the semantic shift of the non-entity type. In this paper, we propose GenCNER, a simple but effective Generative framework for CNER that mitigates both drawbacks. Specifically, we convert CNER into a sustained entity triplet sequence generation problem and solve it with a powerful pre-trained seq2seq model. In addition, we design a type-specific confidence-based pseudo-labeling strategy together with knowledge distillation (KD) to preserve learned knowledge and alleviate the impact of label noise at the triplet level. Experimental results on two benchmark datasets show that our framework outperforms previous state-of-the-art methods in multiple CNER settings and achieves the smallest gap relative to non-CL results.
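The paper itself does not include code; the following is a minimal sketch of what type-specific confidence-based pseudo-labeling at the triplet level could look like. All names (`Triplet`, `pseudo_label`, the per-type thresholds, the overlap rule) are illustrative assumptions, not the authors' implementation: the previous-step model generates (type, start, end) triplets for old entity types on the current data, each triplet is kept only if its confidence clears the threshold for its own type, and gold annotations for the new types take priority over overlapping pseudo labels.

```python
# Hedged sketch (not the authors' code): type-specific confidence-based
# pseudo-labeling for continual NER at the entity-triplet level.
from dataclasses import dataclass

@dataclass(frozen=True)
class Triplet:
    etype: str   # entity type, e.g. "PER" (hypothetical label set)
    start: int   # token index where the mention starts
    end: int     # token index where the mention ends
    conf: float  # model confidence for this generated triplet

def pseudo_label(old_model_triplets, gold_new_triplets, thresholds):
    """Merge pseudo labels for old types with gold labels for new types.

    old_model_triplets: triplets generated by the previous-step model on
        the current sentence (covers previously learned types only).
    gold_new_triplets:  gold annotations for the newly added types.
    thresholds: dict mapping entity type -> confidence threshold; a
        pseudo triplet is kept only if it clears the threshold for its
        own type (the "type-specific" part of the strategy).
    """
    # Filter pseudo triplets by their type's own confidence threshold.
    kept = [t for t in old_model_triplets
            if t.conf >= thresholds.get(t.etype, 1.0)]

    def overlaps(a, b):
        # Two spans overlap unless one ends before the other begins.
        return not (a.end < b.start or b.end < a.start)

    # Gold spans take priority: drop pseudo triplets that clash with them.
    kept = [t for t in kept
            if not any(overlaps(t, g) for g in gold_new_triplets)]
    return kept + list(gold_new_triplets)
```

The merged triplet list would then serve as the training target sequence for the current-step seq2seq model, alongside a distillation loss against the old model's outputs.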