Overcoming Prompts Pool Confusion via Parameterized Prompt for Incremental Object Detection

📅 2025-10-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing prompt-pool-based incremental object detection (IOD) methods assume mutually exclusive categories across tasks, overlooking the natural co-occurrence of classes in images—leading to prompt confusion caused by unlabeled old-class objects in new-task images. To address this, we propose P²IOD, the first IOD framework incorporating learnable neural networks to construct structured, evolvable parameterized prompts. We design a structurally constrained prompt fusion mechanism that balances dynamic prompt updating with historical stability. Integrated end-to-end into a pre-trained detector, P²IOD effectively mitigates semantic interference in co-occurrence scenarios. Extensive experiments on PASCAL VOC2007 and MS COCO demonstrate that P²IOD significantly outperforms existing IOD methods, achieving state-of-the-art performance in both backward/forward compatibility and overall detection accuracy.

📝 Abstract
Recent studies have demonstrated that incorporating trainable prompts into pretrained models enables effective incremental learning. However, the application of prompts in incremental object detection (IOD) remains underexplored. Existing prompt-pool-based approaches assume disjoint class sets across incremental tasks, which is unsuitable for IOD because it overlooks the inherent co-occurrence phenomenon in detection images. In co-occurring scenarios, unlabeled objects from previous tasks may appear in current-task images, leading to confusion in the prompt pool. In this paper, we hold that prompt structures should exhibit adaptive consolidation properties across tasks, with constrained updates to prevent catastrophic forgetting. Motivated by this, we introduce Parameterized Prompts for Incremental Object Detection (P$^2$IOD). Leveraging the global evolution properties of neural networks, P$^2$IOD employs networks as parameterized prompts to adaptively consolidate knowledge across tasks. To constrain prompt structure updates, P$^2$IOD further employs a parameterized prompt fusion strategy. Extensive experiments on the PASCAL VOC2007 and MS COCO datasets demonstrate P$^2$IOD's effectiveness in IOD and show that it achieves state-of-the-art performance among existing baselines.
Problem

Research questions and friction points this paper is trying to address.

Addresses prompt-pool confusion in incremental object detection
Handles co-occurring objects across tasks to prevent catastrophic forgetting
Introduces parameterized prompts for adaptive knowledge consolidation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameterized prompts adaptively consolidate knowledge across tasks
Neural networks serve as parameterized prompts, exploiting their global evolution properties
A parameterized prompt fusion strategy constrains prompt structure updates
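The core ideas above (a small neural network acting as a "parameterized prompt", plus a fusion step that constrains how its parameters change between tasks) can be sketched roughly as follows. This is an illustrative sketch only: the network shape, the task embedding, and the convex-combination fusion rule with weight `alpha` are all assumptions for exposition, not the paper's actual P²IOD algorithm.

```python
# Sketch: a tiny linear network as a "parameterized prompt", with a
# hypothetical fusion rule that blends old (stable) and new (plastic)
# parameters to constrain structural drift across incremental tasks.
import numpy as np

rng = np.random.default_rng(0)

def init_prompt_net(dim_in=16, n_tokens=4, dim_tok=16):
    """Initialize a minimal prompt network: task embedding -> prompt tokens."""
    return {"W": rng.normal(0.0, 0.02, (dim_in, n_tokens * dim_tok)),
            "b": np.zeros(n_tokens * dim_tok)}

def prompt_tokens(net, task_emb, n_tokens=4, dim_tok=16):
    """Generate prompt tokens from the parameterized prompt network."""
    out = task_emb @ net["W"] + net["b"]
    return out.reshape(n_tokens, dim_tok)

def fuse_prompts(old_net, new_net, alpha=0.7):
    """Constrained update (assumed form): convex blend of old and new params."""
    return {k: alpha * old_net[k] + (1.0 - alpha) * new_net[k] for k in old_net}

old = init_prompt_net()          # prompt network after the previous task
new = init_prompt_net()          # prompt network adapted to the current task
fused = fuse_prompts(old, new)   # structurally constrained merged prompt

task_emb = rng.normal(size=16)
tokens = prompt_tokens(fused, task_emb)
print(tokens.shape)  # (4, 16): n_tokens prompt vectors of width dim_tok
```

The point of the sketch is the contrast with a discrete prompt pool: instead of selecting from fixed prompt entries (which co-occurring old-class objects can confuse), a single evolvable network generates the prompts, and fusion keeps its update anchored to the previous task's parameters.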
Authors

Zijia An
Institute of Computing Technology, Chinese Academy of Sciences

Boyu Diao
Institute of Computing Technology, Chinese Academy of Sciences

Ruiqi Liu
Texas Tech University
nonparametric methods, machine learning, econometrics

Libo Huang
Institute of Computing Technology, Chinese Academy of Sciences
Continual Learning, Neural Data Analysis

Chuanguang Yang
Institute of Computing Technology, Chinese Academy of Sciences
Computer Vision, Knowledge Distillation, Representation Learning

Fei Wang
Institute of Computing Technology, Chinese Academy of Sciences

Zhulin An
Institute of Computing Technology, Chinese Academy of Sciences
Automatic Deep Learning, Lifelong Learning

Yongjun Xu
Institute of Computing Technology, Chinese Academy of Sciences