DPCore: Dynamic Prompt Coreset for Continual Test-Time Adaptation

📅 2024-06-15
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses convergence difficulties, catastrophic forgetting, and erroneous knowledge transfer in continual test-time adaptation (CTTA), all caused by non-stationary target-domain shifts that vary in frequency and duration. To this end, the authors propose the first lightweight adaptive prompting method tailored to realistic dynamic environments. The core innovation is a dynamic prompt coreset mechanism: it adaptively reuses or creates visual prompts based on a similarity criterion and employs a lightweight learnable prompt encoder for parameter-efficient updates. Evaluated on four standard CTTA benchmarks, the method achieves state-of-the-art performance in both structured and unstructured dynamic scenarios, reducing trainable parameters by 99% and computation time by 64% compared with existing approaches while significantly improving generalization robustness and adaptation efficiency.

๐Ÿ“ Abstract
Continual Test-Time Adaptation (CTTA) seeks to adapt source pre-trained models to continually changing, unseen target domains. While existing CTTA methods assume structured domain changes with uniform durations, real-world environments often exhibit dynamic patterns where domains recur with varying frequencies and durations. Current approaches, which adapt the same parameters across different domains, struggle in such dynamic conditions: they face convergence issues with brief domain exposures, risk forgetting previously learned knowledge, or misapply it to irrelevant domains. To remedy this, we propose DPCore, a method designed for robust performance across diverse domain change patterns while ensuring computational efficiency. DPCore integrates three key components: Visual Prompt Adaptation for efficient domain alignment, a Prompt Coreset for knowledge preservation, and a Dynamic Update mechanism that intelligently adjusts existing prompts for similar domains while creating new ones for substantially different domains. Extensive experiments on four benchmarks demonstrate that DPCore consistently outperforms various CTTA methods, achieving state-of-the-art performance in both structured and dynamic settings while reducing trainable parameters by 99% and computation time by 64% compared to previous approaches.
Problem

Research questions and friction points this paper is trying to address.

Adapting source pre-trained models to continually changing target domains
Handling domains that recur with varying frequency and duration
Reducing the computational cost of test-time adaptation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic Prompt Coreset
Visual Prompt Adaptation
Dynamic Update mechanism
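The Dynamic Update mechanism above can be sketched in code: maintain a coreset of prompts, each paired with statistics of the domain it was learned on; for a new batch, reuse (and refresh) the closest stored prompt if the batch's feature statistics are similar enough, otherwise create a new prompt. This is a minimal illustrative sketch only; the distance metric, threshold, and momentum update below are assumptions, not the paper's exact formulation.

```python
import numpy as np

class PromptCoreset:
    """Hypothetical sketch of a dynamic prompt coreset (DPCore-style).

    Each stored prompt is paired with the feature mean of the domain it
    was created on. The threshold and EMA momentum are illustrative
    assumptions, not values from the paper.
    """

    def __init__(self, prompt_dim, threshold=1.0, momentum=0.9):
        self.prompt_dim = prompt_dim
        self.threshold = threshold  # max distance at which a prompt is reused (assumed)
        self.momentum = momentum    # EMA factor for refreshing stored stats (assumed)
        self.prompts = []           # learnable visual prompts, one per seen domain
        self.stats = []             # per-prompt domain feature means

    def select_or_create(self, batch_features):
        """Return the index of the prompt to use for this batch of features."""
        mean = batch_features.mean(axis=0)
        if self.prompts:
            dists = [np.linalg.norm(mean - s) for s in self.stats]
            best = int(np.argmin(dists))
            if dists[best] < self.threshold:
                # Similar domain: reuse the existing prompt and refresh its stats.
                self.stats[best] = (self.momentum * self.stats[best]
                                    + (1 - self.momentum) * mean)
                return best
        # Substantially different domain: create a fresh prompt.
        self.prompts.append(np.zeros(self.prompt_dim))
        self.stats.append(mean)
        return len(self.prompts) - 1
```

On a recurring domain this routing returns the same prompt index, so previously learned prompts are reused rather than overwritten, which is the intuition behind the coreset's resistance to forgetting.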