Index-Aligned Query Distillation for Transformer-based Incremental Object Detection

📅 2025-08-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
In transformer-based incremental object detection, Hungarian Matching induces unstable cross-stage query correspondences, reshaping the semantics encoded in each query and causing severe forgetting of previously learned categories. This work identifies, for the first time, the fundamental inadequacy of Hungarian Matching for knowledge distillation in incremental learning. To address this, we propose Index-Aligned Query Distillation (IAQD): a hard alignment mechanism that fixes query indices across stages to ensure semantic consistency, eliminating the dynamic reallocation inherent in Hungarian Matching. Furthermore, we introduce a key-query selection strategy that distills only those queries encoding old-category knowledge, effectively decoupling old and new tasks. Evaluated on standard benchmarks including COCO and PASCAL VOC, IAQD significantly mitigates catastrophic forgetting and achieves state-of-the-art performance.

📝 Abstract
Incremental object detection (IOD) aims to continuously expand a model's capability to detect novel categories while preserving its performance on previously learned ones. When a transformer-based detection model is adopted for IOD, catastrophic knowledge forgetting inevitably occurs: detection performance on previously learned categories severely degrades. Previous typical methods mainly rely on knowledge distillation (KD) to mitigate this forgetting. Specifically, they use Hungarian Matching to build a correspondence between the queries of the last-phase and current-phase detection models, and align the classifier and regressor outputs between matched queries to avoid knowledge forgetting. However, we observe that Hungarian Matching is not a good choice for the IOD task: with Hungarian Matching, a query of the current-phase model may match different queries of the last-phase model at different iterations during KD. As a result, the knowledge encoded in each query may be reshaped towards new categories, leading to forgetting of the previously encoded knowledge of old categories. Based on these observations, we propose a new distillation approach named Index-Aligned Query Distillation (IAQD) for transformer-based IOD. Rather than relying on Hungarian Matching, IAQD establishes a correspondence between queries of the previous- and current-phase models that share the same index. Moreover, we perform index-aligned distillation only on the partial set of queries that are critical for detecting previous categories. In this way, IAQD largely preserves the previous semantic and spatial encoding capabilities without interfering with the learning of new categories. Extensive experiments on representative benchmarks demonstrate that IAQD effectively mitigates knowledge forgetting, achieving new state-of-the-art performance.
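The core mechanism described above (pairing queries by index rather than by Hungarian Matching, and distilling only a selected subset of "key" queries) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the L2 distillation loss, and the toy data are all assumptions for demonstration.

```python
import numpy as np

def index_aligned_distill_loss(old_out, new_out, key_indices):
    """Distill only the selected key queries, pairing queries by index:
    query i of the current-phase model always learns from query i of the
    previous-phase model (hard alignment), so no dynamic reallocation
    of correspondences can occur across iterations."""
    old_sel = old_out[key_indices]  # (K, D) previous-phase outputs
    new_sel = new_out[key_indices]  # (K, D) current-phase outputs
    return float(np.mean((new_sel - old_sel) ** 2))

# Toy example: 5 queries with 4-dim outputs; queries 0 and 3 are taken
# to be the key queries encoding old-category knowledge (an assumption).
rng = np.random.default_rng(0)
old_out = rng.standard_normal((5, 4))
new_out = old_out.copy()
new_out[1] += 10.0  # a non-key query drifts freely toward new categories

key_indices = np.array([0, 3])
loss = index_aligned_distill_loss(old_out, new_out, key_indices)
print(loss)  # 0.0: drift on non-key queries is not penalized
```

Restricting the loss to key indices is what decouples old and new tasks in this sketch: non-key queries are free to specialize on new categories without incurring any distillation penalty.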
Problem

Research questions and friction points this paper is trying to address.

Mitigates catastrophic forgetting in incremental object detection
Addresses query misalignment in transformer-based detection models
Preserves old category knowledge while learning new ones
Innovation

Methods, ideas, or system contributions that make the work stand out.

Index-aligned query distillation for transformers
Partial query distillation on critical indices
Preserves semantic and spatial encoding capabilities
Mingxiao Ma
Beihang University
Shunyao Zhu
Beihang University
Guoliang Kang
Professor, Beihang University
Deep learning and its applications