AI Summary
To address the reliance on large-scale labeled data and the poor generalization seen in source-free domain adaptation (SFDA) and person re-identification (ReID), this paper proposes a dual-region decoupled augmentation method: random noise perturbation is applied to foreground regions, while semantic-aware block-wise spatial rearrangement is performed on background regions, enabling structured, semantics-preserving data augmentation. This is the first approach to explicitly decouple and jointly model foreground and background augmentations, effectively mitigating domain shift and overfitting under limited target-domain samples. Under the SFDA single- and multi-target settings on PACS, the method improves classification accuracy by 3.2% and 2.8%, respectively. On Market-1501 and DukeMTMC-reID, it boosts mAP by 4.1-5.3% over conventional augmentation strategies, significantly enhancing model robustness and cross-domain adaptability.
Abstract
This paper introduces a novel dual-region augmentation approach designed to reduce reliance on large-scale labeled datasets while improving model robustness and adaptability across diverse computer vision tasks, including source-free domain adaptation (SFDA) and person re-identification (ReID). The method performs targeted data transformations by applying random noise perturbations to foreground objects and spatially shuffling background patches, which increases the diversity of the training data and in turn improves robustness and generalization. Evaluations on the PACS dataset for SFDA demonstrate that the augmentation strategy consistently outperforms existing methods, achieving significant accuracy improvements in both single-target and multi-target adaptation settings. By augmenting training data through structured transformations, the method enables models to generalize across domains, providing a scalable way to reduce reliance on manually annotated datasets. Furthermore, experiments on the Market-1501 and DukeMTMC-reID datasets validate the effectiveness of the approach for person ReID, surpassing traditional augmentation techniques.
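The two-branch transformation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function name `dual_region_augment`, the patch size, the noise scale, and the rule for deciding which patches count as "background" (majority of pixels outside the foreground mask) are all assumptions for the sake of the example.

```python
import numpy as np

def dual_region_augment(image, fg_mask, patch=16, noise_std=0.1, rng=None):
    """Sketch of dual-region augmentation (illustrative, not the paper's code).

    image   : float32 HxWxC array with values in [0, 1]
    fg_mask : boolean HxW array marking foreground pixels
    Foreground pixels get Gaussian noise; background patches are shuffled.
    """
    rng = rng if rng is not None else np.random.default_rng()
    out = image.astype(np.float32).copy()

    # Foreground branch: random noise perturbation on foreground pixels only.
    noise = rng.normal(0.0, noise_std, size=image.shape).astype(np.float32)
    out[fg_mask] += noise[fg_mask]

    # Background branch: shuffle non-overlapping patches whose pixels are
    # mostly background (assumed threshold: < 50% foreground coverage).
    h, w = image.shape[:2]
    coords = [(y, x)
              for y in range(0, h - patch + 1, patch)
              for x in range(0, w - patch + 1, patch)
              if fg_mask[y:y + patch, x:x + patch].mean() < 0.5]
    shuffled = list(coords)
    rng.shuffle(shuffled)
    src = out.copy()
    for (y, x), (sy, sx) in zip(coords, shuffled):
        out[y:y + patch, x:x + patch] = src[sy:sy + patch, sx:sx + patch]

    return np.clip(out, 0.0, 1.0)
```

Keeping the two branches decoupled means the foreground object's spatial structure is preserved (only its appearance is perturbed), while the background loses its layout but keeps its local texture statistics, which matches the semantics-preserving goal stated above.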