ElimPCL: Eliminating Noise Accumulation with Progressive Curriculum Labeling for Source-Free Domain Adaptation

📅 2025-03-31
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In source-free domain adaptation (SFDA), pre-trained source models generate highly uncertain pseudo-labels for hard samples in the target domain, leading to noise accumulation and erroneous propagation in feature space. To address this, we propose ElimPCL: (1) a prototype-consistency-based progressive curriculum labeling mechanism that dynamically selects high-confidence samples; and (2) a novel dual-path MixUp feature augmentation strategy that fuses reliable and hard samples in feature space, enhancing discriminability of hard samples while suppressing noise interference. ElimPCL operates without access to source data or labels, requiring only a single forward pass on the target domain. Evaluated on multiple SFDA benchmarks, it consistently outperforms state-of-the-art methods, achieving up to a 3.4% absolute accuracy gain. The framework effectively mitigates both the accumulation and diffusion of pseudo-label noise.
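The prototype-consistency selection described above can be sketched roughly as follows: build soft class prototypes from the target features, then keep only samples whose classifier pseudo-label agrees with the nearest-prototype label and whose confidence clears a threshold. This is a minimal illustrative sketch, not the paper's exact recipe; the function name, the probability-weighted prototypes, cosine similarity, and the 0.9 threshold are all assumptions.

```python
import numpy as np

def prototype_consistency_filter(features, probs, threshold=0.9):
    """Keep samples whose classifier pseudo-label matches the label given
    by the nearest class prototype (illustrative sketch of the
    prototype-consistency idea; details are assumptions)."""
    pseudo = probs.argmax(axis=1)
    # Soft class prototypes: probability-weighted mean of target features.
    protos = (probs.T @ features) / (probs.sum(axis=0, keepdims=True).T + 1e-8)
    # Cosine similarity of each feature to each prototype.
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-8)
    p = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-8)
    proto_label = (f @ p.T).argmax(axis=1)
    confident = probs.max(axis=1) >= threshold
    keep = (pseudo == proto_label) & confident
    return keep, pseudo
```

In a progressive curriculum, the threshold would be relaxed over training rounds so that harder samples enter the trusted set only once the model has stabilized on the easy ones.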

📝 Abstract
Source-Free Domain Adaptation (SFDA) aims to train a target model without source data, and the key is to generate pseudo-labels using a pre-trained source model. However, we observe that the source model often produces highly uncertain pseudo-labels for hard samples, particularly those heavily affected by domain shifts, leading to these noisy pseudo-labels being introduced even before adaptation and further reinforced through parameter updates. Additionally, they continuously influence neighbor samples through propagation in the feature space. To eliminate the issue of noise accumulation, we propose a novel Progressive Curriculum Labeling (ElimPCL) method, which iteratively filters trustworthy pseudo-labeled samples based on prototype consistency to exclude high-noise samples from training. Furthermore, a Dual MixUP technique is designed in the feature space to enhance the separability of hard samples, thereby mitigating the interference of noisy samples on their neighbors. Extensive experiments validate the effectiveness of ElimPCL, achieving up to a 3.4% improvement on challenging tasks compared to state-of-the-art methods.
Problem

Research questions and friction points this paper is trying to address.

Reducing noise in pseudo-labels for SFDA
Handling domain shifts in hard samples
Preventing noise propagation in feature space
Innovation

Methods, ideas, or system contributions that make the work stand out.

Progressive Curriculum Labeling filters noisy pseudo-labels
Dual MixUP enhances hard sample separability
Prototype consistency excludes high-noise samples
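The dual MixUp idea of fusing reliable and hard samples in feature space might look roughly like the sketch below: each hard sample's feature is interpolated with a randomly paired reliable feature, with the reliable side kept dominant so that hard samples are pulled toward well-separated regions. The pairing strategy, Beta(alpha, alpha) mixing, and the dominance trick are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dual_feature_mixup(reliable_feats, reliable_labels, hard_feats, hard_labels,
                       num_classes, alpha=0.75, rng=None):
    """Mix each hard-sample feature with a randomly paired reliable feature
    (illustrative sketch of feature-space MixUp; details are assumptions)."""
    if rng is None:
        rng = np.random.default_rng(0)
    idx = rng.integers(0, len(reliable_feats), size=len(hard_feats))
    lam = rng.beta(alpha, alpha, size=(len(hard_feats), 1))
    lam = np.maximum(lam, 1.0 - lam)  # keep the reliable side dominant
    mixed = lam * reliable_feats[idx] + (1.0 - lam) * hard_feats
    one_hot = np.eye(num_classes)
    mixed_labels = (lam * one_hot[reliable_labels[idx]]
                    + (1.0 - lam) * one_hot[hard_labels])
    return mixed, mixed_labels
```

Training on such mixed features with soft labels would smooth the decision boundary around hard samples, which is one plausible way the method limits noise diffusion to neighbors.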
Jie Cheng
Institute of Automation, Chinese Academy of Sciences
Hao Zheng
School of Computer Science and Engineering, Central South University, Changsha, China
Meiguang Zheng
School of Computer Science and Engineering, Central South University, Changsha, China
Lei Wang
School of Computer Science and Engineering, Central South University, Changsha, China
Hao Wu
School of Computer Science and Engineering, Central South University, Changsha, China
Jian Zhang
School of Computer Science and Engineering, Central South University, Changsha, China