GALA: A GlobAl-LocAl Approach for Multi-Source Active Domain Adaptation

📅 2025-10-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the challenge of sample selection in multi-source active domain adaptation (MS-ADA), where concurrent inter-class diversity and multi-source domain shifts impede reliable query selection. To tackle this, we propose GALA, a global-local joint selection strategy that integrates k-means-based global clustering with intra-cluster, uncertainty-driven local selection. GALA is a plug-and-play sample-selection framework that requires no additional trainable parameters. While maintaining computational efficiency, it significantly improves target-labeling efficiency: on three standard benchmarks, it achieves near fully supervised performance using only 1% labeled target data, substantially outperforming existing MS-ADA methods. Its core contribution lies in decoupling the modeling of cross-domain discrepancies from intra-class distribution heterogeneity, establishing a new paradigm for multi-source active learning.
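The two-stage selection described in the summary (global k-means clustering, then uncertainty-driven picks inside each cluster) can be sketched as follows. This is a hypothetical NumPy illustration, not the authors' released code: the function names, the predictive-entropy criterion, and the proportional per-cluster budget split are all assumptions made for the example.

```python
import numpy as np

def _kmeans_labels(X, k, iters=20, seed=0):
    """Plain NumPy k-means; returns a cluster id for every row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Squared Euclidean distance from every sample to every center.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        for c in range(k):
            members = X[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels

def gala_select(features, probs, budget, n_clusters=10, seed=0):
    """Pick `budget` target-domain samples to annotate.

    features: (N, D) target feature vectors from the current model.
    probs:    (N, C) softmax class probabilities for the same samples.
    Global step: cluster the target features so the query set covers
    the target distribution. Local step: within each cluster, take the
    samples the model is least certain about.
    """
    labels = _kmeans_labels(features, n_clusters, seed=seed)
    # Predictive entropy as the cluster-wise uncertainty score
    # (an illustrative choice; the paper's exact criterion may differ).
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)

    selected = []
    for c in range(n_clusters):
        idx = np.flatnonzero(labels == c)
        if idx.size == 0:
            continue
        # Share the labeling budget across clusters in proportion to size.
        quota = max(1, round(budget * idx.size / len(features)))
        selected.extend(idx[np.argsort(-entropy[idx])[:quota]].tolist())
    return np.array(selected[:budget])
```

Because the selection only needs features and softmax outputs from the existing model, a routine like this can sit on top of any DA training loop, which is consistent with the plug-and-play, parameter-free claim.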

📝 Abstract
Domain Adaptation (DA) provides an effective way to tackle target-domain tasks by leveraging knowledge learned from source domains. Recent studies have extended this paradigm to Multi-Source Domain Adaptation (MSDA), which exploits multiple source domains carrying richer and more diverse transferable information. However, a substantial performance gap still remains between adaptation-based methods and fully supervised learning. In this paper, we explore a more practical and challenging setting, named Multi-Source Active Domain Adaptation (MS-ADA), to further enhance target-domain performance by selectively acquiring annotations from the target domain. The key difficulty of MS-ADA lies in designing selection criteria that can jointly handle inter-class diversity and multi-source domain variation. To address these challenges, we propose a simple yet effective GlobAl-LocAl Approach (GALA), which combines a global k-means clustering step for target-domain samples with a cluster-wise local selection criterion, tackling the two issues above in a complementary manner. Our proposed GALA is plug-and-play and can be seamlessly integrated into existing DA frameworks without introducing any additional trainable parameters. Extensive experiments on three standard DA benchmarks demonstrate that GALA consistently outperforms prior active learning and active DA methods, achieving performance comparable to the fully supervised upper bound while using only 1% of the target annotations.
Problem

Research questions and friction points this paper is trying to address.

Enhancing target-domain performance with selective annotations
Addressing inter-class diversity and multi-source domain variation
Achieving near-supervised performance with minimal target labels
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines global clustering with local selection
Plug-and-play integration into existing frameworks
Uses only 1% target annotations for near-supervised performance
Juepeng Zheng
School of Artificial Intelligence, Sun Yat-Sen University, Zhuhai, China, also with the National Supercomputing Center in Shenzhen, Shenzhen, China
Peifeng Zhang
School of Artificial Intelligence, Sun Yat-Sen University, Zhuhai, China
Yibin Wen
School of Artificial Intelligence, Sun Yat-Sen University, Zhuhai, China
Qingmei Li
Tsinghua University
Remote Sensing, Spatial Analysis
Yang Zhang
School of Artificial Intelligence, Sun Yat-Sen University, Zhuhai, China
Haohuan Fu
Tsinghua University