AI Summary
To address geometric manifold degradation and catastrophic forgetting in source-free continual domain adaptation, this paper proposes a framework that integrates self-supervised pretraining with geometric manifold alignment. Unlike existing approaches that rely on cosine similarity over L2-normalized features, it incorporates self-supervised contrastive learning into source-free continual adaptation to preserve the intrinsic geometric structure of the source model's latent manifold. A space similarity loss enforces cross-domain geometric consistency, and an exponential moving average (EMA) update of the teacher's parameters mitigates forgetting. Extensive experiments on multiple benchmark datasets show that the method consistently outperforms state-of-the-art source-free domain adaptation (SFDA) approaches, validating its effectiveness in preserving feature-space geometry and enabling robust knowledge transfer across domains.
Abstract
Source-Free Domain Adaptation (SFDA) addresses the challenge of adapting a model to a target domain without access to the source-domain data. Prevailing methods typically start from a source model pre-trained with full supervision and distill its knowledge by aligning instance-level features. However, these approaches, which rely on cosine similarity over L2-normalized feature vectors, inadvertently discard crucial geometric information about the source model's latent manifold. We introduce Self-supervised Continual Domain Adaptation (SCoDA) to address these limitations. We make two key departures from standard practice: first, we avoid reliance on supervised pre-training by initializing the proposed framework with a teacher model pre-trained entirely via self-supervised learning (SSL). Second, we adapt the principle of geometric manifold alignment to the SFDA setting. The student is trained with a composite objective combining instance-level feature matching with a Space Similarity Loss. To combat catastrophic forgetting, the teacher's parameters are updated via an Exponential Moving Average (EMA) of the student's parameters. Extensive experiments on benchmark datasets demonstrate that SCoDA significantly outperforms state-of-the-art SFDA methods.
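The three ingredients above can be sketched concretely. The snippet below is a minimal NumPy illustration, not the paper's implementation: the instance-level term is taken as mean cosine distance between paired student/teacher features, the Space Similarity Loss is assumed to match the batch-wise pairwise cosine-similarity matrices of the two feature sets (one plausible reading of "preserving manifold geometry"), and the EMA rule follows the standard teacher-update convention. All function names and the exact loss forms are illustrative assumptions.

```python
import numpy as np

def _l2_normalize(x, eps=1e-12):
    # Row-wise L2 normalization of a (batch, dim) feature matrix.
    return x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)

def instance_matching_loss(student_feats, teacher_feats):
    # Instance-level feature matching: mean cosine distance between
    # each student feature and its teacher counterpart.
    s, t = _l2_normalize(student_feats), _l2_normalize(teacher_feats)
    return float(np.mean(1.0 - np.sum(s * t, axis=1)))

def space_similarity_loss(student_feats, teacher_feats):
    # Assumed form of the Space Similarity Loss: penalize discrepancy
    # between the pairwise cosine-similarity matrices of the student and
    # teacher batches, so the student preserves relational geometry
    # (distances between samples), not just per-instance directions.
    s, t = _l2_normalize(student_feats), _l2_normalize(teacher_feats)
    return float(np.mean((s @ s.T - t @ t.T) ** 2))

def ema_update(teacher_params, student_params, momentum=0.999):
    # Standard EMA teacher update:
    # teacher <- momentum * teacher + (1 - momentum) * student.
    for name in teacher_params:
        teacher_params[name] = (momentum * teacher_params[name]
                                + (1.0 - momentum) * student_params[name])
    return teacher_params
```

A training step would combine the two losses (e.g. `loss = instance_matching_loss(s, t) + lam * space_similarity_loss(s, t)`) for the student's gradient update, then call `ema_update` on the teacher; the relative weight `lam` is a hypothetical hyperparameter here.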