Tackling Dimensional Collapse toward Comprehensive Universal Domain Adaptation

📅 2024-10-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the performance collapse of partial domain matching (PDM) methods in universal domain adaptation (UniDA) under severe target-domain class scarcity (e.g., only 5% shared classes), identifying "dimensional collapse" as the root cause: target representations degenerate onto a low-dimensional manifold, eroding discriminative structure. To mitigate this, the authors propose a self-supervised framework that jointly enforces contrastive alignment and uniformity regularization: a contrastive loss on unlabeled target data achieves cross-domain semantic alignment, uniformity regularization preserves feature-space coverage, and shared-class discriminative features are explicitly decoupled. Evaluated on UniDA benchmarks with varying shared-class ratios, the method establishes new state-of-the-art performance, significantly improving transfer accuracy, especially under extreme scarcity.
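The alignment and uniformity objectives mentioned above follow the standard self-supervised formulation: alignment pulls positive-pair embeddings together, while uniformity spreads all embeddings over the unit hypersphere, counteracting dimensional collapse. A minimal NumPy sketch (function names and the temperature `t` are illustrative, not the paper's exact implementation):

```python
import numpy as np

def alignment_loss(z1, z2):
    """Mean squared distance between positive-pair embeddings.

    z1, z2: arrays of shape (n, d), rows assumed L2-normalized.
    Lower is better: positives map to nearby points.
    """
    return np.mean(np.sum((z1 - z2) ** 2, axis=1))

def uniformity_loss(z, t=2.0):
    """Log of the mean pairwise Gaussian potential.

    Minimized when embeddings are spread uniformly on the
    hypersphere; a collapsed (low-dimensional) representation
    drives this loss up, which is what the regularizer penalizes.
    """
    # Pairwise squared distances via broadcasting: (n, n).
    sq_dists = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    n = z.shape[0]
    off_diag = ~np.eye(n, dtype=bool)  # exclude self-pairs
    return np.log(np.mean(np.exp(-t * sq_dists[off_diag])))
```

Intuitively, if all target features collapse to one direction, every pairwise distance shrinks toward zero and `uniformity_loss` rises toward 0; well-spread features give large distances and a strongly negative loss.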

📝 Abstract
Universal Domain Adaptation (UniDA) addresses unsupervised domain adaptation where target classes may differ arbitrarily from source ones, except for a shared subset. An important approach, partial domain matching (PDM), aligns only shared classes but struggles in extreme cases where many source classes are absent in the target domain, underperforming the most naive baseline that trains on only source data. In this work, we identify that the failure of PDM for extreme UniDA stems from dimensional collapse (DC) in target representations. To address target DC, we propose to jointly leverage the alignment and uniformity techniques in modern self-supervised learning (SSL) on the unlabeled target data to preserve the intrinsic structure of the learned representations. Our experimental results confirm that SSL consistently advances PDM and delivers new state-of-the-art results across a broader benchmark of UniDA scenarios with different portions of shared classes, representing a crucial step toward truly comprehensive UniDA.
Problem

Research questions and friction points this paper is trying to address.

Addresses dimensional collapse in UniDA
Improves partial domain matching with SSL
Advances state-of-the-art in universal domain adaptation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-supervised learning techniques
Prevents dimensional collapse
Enhances partial domain matching
Hung-Chieh Fang
National Taiwan University
Po-Yi Lu
National Taiwan University
Hsuan-Tien Lin
Professor of Computer Science and Information Engineering, National Taiwan University
Machine Learning · Data Mining