🤖 AI Summary
Few-shot continual relation extraction suffers from catastrophic forgetting and difficulty in knowledge consolidation, especially under data scarcity, where modeling relational semantics and mitigating overfitting remain challenging. To address this, we propose the first description-driven retrieval-based continual learning paradigm: leveraging large language models to generate structured relational descriptions, constructing a dual-encoder retrieval framework that jointly encodes class prototypes and semantic descriptions, and designing a reciprocal rank fusion (RRF)-based prediction mechanism for robust inference. Our method achieves significant improvements over state-of-the-art approaches across multiple benchmarks, demonstrating superior stability, strong generalization capability, and effective forgetting mitigation. It establishes a novel paradigm for continual relation learning in low-resource, dynamic environments.
📝 Abstract
Few-shot Continual Relation Extraction is a crucial challenge for enabling AI systems to identify and adapt to evolving relationships in dynamic real-world domains. Traditional memory-based approaches often overfit to limited samples and fail to reinforce old knowledge, and the scarcity of data in few-shot scenarios further exacerbates these issues by hindering effective data augmentation in the latent space. In this paper, we propose a novel retrieval-based solution, starting with a large language model to generate descriptions for each relation. From these descriptions, we introduce a bi-encoder retrieval training paradigm to enrich both sample and class representation learning. Leveraging these enhanced representations, we design a retrieval-based prediction method where each sample "retrieves" the best-fitting relation via a reciprocal rank fusion score that integrates both relation description vectors and class prototypes. Extensive experiments on multiple datasets demonstrate that our method significantly advances the state-of-the-art by maintaining robust performance across sequential tasks, effectively addressing catastrophic forgetting.
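The reciprocal rank fusion step can be sketched in a few lines. This is a minimal illustration of standard RRF scoring, not the paper's implementation: the relation names, the two example rankings (one standing in for description-vector retrieval, one for class-prototype retrieval), and the smoothing constant k=60 are all illustrative assumptions.

```python
def rrf_scores(rankings, k=60):
    """Fuse ranked candidate lists: score(r) = sum_i 1 / (k + rank_i(r)).

    `rankings` is a list of ranked lists of relation labels, best first.
    k=60 is a conventional smoothing constant, assumed here.
    """
    scores = {}
    for ranking in rankings:
        for rank, rel in enumerate(ranking, start=1):
            scores[rel] = scores.get(rel, 0.0) + 1.0 / (k + rank)
    return scores

# Hypothetical rankings of candidate relations for one query sample:
desc_ranking = ["founded_by", "located_in", "ceo_of"]   # from description vectors
proto_ranking = ["founded_by", "ceo_of", "located_in"]  # from class prototypes

scores = rrf_scores([desc_ranking, proto_ranking])
prediction = max(scores, key=scores.get)  # relation with the highest fused score
print(prediction)  # → founded_by (ranked first by both lists)
```

Because RRF works on ranks rather than raw similarity scores, the two retrieval views need no score calibration before fusion, which keeps the prediction robust when the two encoders produce scores on different scales.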