SloMo-Fast: Slow-Momentum and Fast-Adaptive Teachers for Source-Free Continual Test-Time Adaptation

πŸ“… 2025-11-23
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
To address long-term catastrophic forgetting and degraded generalization in source-free continual test-time adaptation (CTTA) under dynamically shifting target domains, this paper proposes SloMo-Fastβ€”a source-free dual-teacher framework. The Slow-Teacher preserves long-term knowledge via momentum-based parameter updates, while the Fast-Teacher rapidly adapts to emerging domain shifts through adaptive optimization. Their synergistic collaboration enables cumulative knowledge integration and unsupervised continual adaptation without accessing source data or class prototypes. We further introduce Cyclic-TTA, a novel benchmark simulating periodic domain shifts to better evaluate temporal robustness. Extensive experiments demonstrate that SloMo-Fast significantly outperforms state-of-the-art methods on Cyclic-TTA and ten established CTTA benchmarks. Results validate its strong robustness to both evolving and revisiting domains, sustained adaptability, and superior generalization capability under strict source-free constraints.
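The summary above describes the core mechanism: both teachers track the adapting model, with the Slow-Teacher updated by a high-momentum exponential moving average (to forget slowly) and the Fast-Teacher by a lower-momentum update (to adapt quickly). A minimal sketch of that idea, with illustrative names and momentum values that are assumptions rather than the paper's actual hyperparameters:

```python
import numpy as np

def ema_update(teacher, student, momentum):
    """Momentum (EMA) update: teacher <- m * teacher + (1 - m) * student."""
    return momentum * teacher + (1.0 - momentum) * student

# Toy 1-D "parameters" standing in for network weights.
student = np.array([1.0, 2.0, 3.0])
slow = np.zeros(3)   # Slow-Teacher: momentum near 1 -> long-term memory
fast = np.zeros(3)   # Fast-Teacher: smaller momentum -> rapid adaptation

for _ in range(10):  # simulate ten adaptation steps on a target domain
    slow = ema_update(slow, student, momentum=0.999)
    fast = ema_update(fast, student, momentum=0.9)

# After a few steps the Fast-Teacher sits much closer to the student than
# the Slow-Teacher, mirroring their fast-adapt / slow-forget roles.
print(np.abs(fast - student).mean() < np.abs(slow - student).mean())  # True
```

In the actual framework the two teachers also supervise the student and each other; this sketch only illustrates why one momentum coefficient yields slow forgetting and the other rapid adaptation.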

πŸ“ Abstract
Continual Test-Time Adaptation (CTTA) is crucial for deploying models in real-world applications with unseen, evolving target domains. Existing CTTA methods, however, often rely on source data or prototypes, limiting their applicability in privacy-sensitive and resource-constrained settings. Additionally, these methods suffer from long-term forgetting, which degrades performance on previously encountered domains as target domains shift. To address these challenges, we propose SloMo-Fast, a source-free, dual-teacher CTTA framework designed for enhanced adaptability and generalization. It includes two complementary teachers: the Slow-Teacher, which exhibits slow forgetting and retains long-term knowledge of previously encountered domains to ensure robust generalization, and the Fast-Teacher, which rapidly adapts to new domains while accumulating and integrating knowledge across them. This framework preserves knowledge of past domains and adapts efficiently to new ones. We also introduce Cyclic Test-Time Adaptation (Cyclic-TTA), a novel CTTA benchmark that simulates recurring domain shifts. Our extensive experiments demonstrate that SloMo-Fast consistently outperforms state-of-the-art methods across Cyclic-TTA, as well as ten other CTTA settings, highlighting its ability to both adapt and generalize across evolving and revisited domains.
Problem

Research questions and friction points this paper is trying to address.

Addresses source-free continual adaptation to evolving domains
Mitigates long-term forgetting in privacy-sensitive environments
Enhances model generalization across recurring domain shifts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Slow-momentum teacher prevents long-term forgetting
Fast-adaptive teacher enables rapid domain adaptation
Dual-teacher framework operates without source data
Md Akil Raihan Iftee
Center for Computational & Data Sciences, Independent University, Bangladesh
Mir Sazzat Hossain
Center for Computational & Data Sciences, Independent University, Bangladesh
Rakibul Hasan Rajib
University of Central Florida, USA
Tariq Iqbal
Assistant Professor, University of Virginia
Robotics Β· Artificial Intelligence Β· Human-Robot Interaction
Md Mofijul Islam
Applied Scientist, AWS GenAI
Multimodal Machine Learning Β· Multitask Learning Β· Vision Β· NLP Β· Multi-agent Planning
M Ashraful Amin
Center for Computational & Data Sciences, Independent University, Bangladesh
Amin Ahsan Ali
Independent University, Bangladesh
Machine Learning Β· Data Science Β· mHealth
AKM Mahbubur Rahman
Center for Computational & Data Sciences, Independent University, Bangladesh