AI Summary
To address catastrophic forgetting and the performance degradation induced by long-tailed distributions in class-incremental object detection, this paper pioneers the integration of neural collapse theory into incremental detection frameworks, proposing an efficient continual learning method built upon the DETR architecture. Our approach jointly incorporates neural collapse regularization, hierarchical category relationship modeling, and a lightweight incremental adaptation module, thereby mitigating forgetting while enhancing generalization for few-shot classes. Evaluated on the COCO and PASCAL VOC benchmarks, the method achieves new state-of-the-art performance: it improves inference speed by 32%, boosts mAP for few-shot categories by 5.8 percentage points, and maintains high accuracy with practical efficiency. The proposed framework thus bridges theoretical insight from neural collapse with the demands of scalable, real-world continual detection.
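The neural collapse regularization mentioned above typically relies on fixing the classifier weights to a simplex equiangular tight frame (ETF), so that every pair of class prototypes is separated by the same angle regardless of how many training samples each class has. The summary does not give the paper's exact formulation, so the following is a minimal sketch of the standard ETF construction; the function name `etf_prototypes` and all parameter choices are illustrative assumptions, not the paper's API.

```python
import numpy as np

def etf_prototypes(num_classes: int, dim: int, seed: int = 0) -> np.ndarray:
    """Simplex equiangular tight frame (ETF) class prototypes (illustrative).

    Neural-collapse-inspired classifiers fix the class weights to an ETF:
    every pair of prototypes then has identical cosine similarity
    -1/(K-1), independent of per-class sample counts, which is what makes
    the construction attractive for long-tailed / few-shot classes.
    """
    assert dim >= num_classes, "need an orthonormal basis with K columns"
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U in R^{dim x K} via QR decomposition.
    u, _ = np.linalg.qr(rng.standard_normal((dim, num_classes)))
    k = num_classes
    # M = sqrt(K/(K-1)) * U (I - (1/K) 11^T); columns come out unit-norm.
    m = np.sqrt(k / (k - 1)) * u @ (np.eye(k) - np.ones((k, k)) / k)
    return m

protos = etf_prototypes(num_classes=5, dim=16)
cos = protos.T @ protos
print(np.allclose(np.diag(cos), 1.0))                    # → True
print(np.allclose(cos[~np.eye(5, dtype=bool)], -0.25))   # → True, -1/(K-1)
```

During incremental training, feature vectors would be pulled toward their class's fixed prototype (e.g. with a dot-product or cosine loss) instead of learning a classifier head that can drift toward head classes.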
Abstract
Recently, object detection models have seen notable performance improvements, particularly with transformer-based architectures. However, new objects frequently appear in the real world, requiring detection models to learn continually without suffering from catastrophic forgetting. Although Incremental Object Detection (IOD) has emerged to address this challenge, existing models remain impractical due to their limited performance and prolonged inference time. In this paper, we introduce a novel framework for IOD, called Hier-DETR: Hierarchical Neural Collapse Detection Transformer, which ensures both efficiency and competitive performance by leveraging Neural Collapse for imbalanced datasets and the hierarchical relations among class labels.