Information Maximization for Long-Tailed Semi-Supervised Domain Generalization

📅 2026-03-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the significant performance degradation of semi-supervised domain generalization (SSDG) under long-tailed class distributions by proposing a novel objective function, IMaX. IMaX introduces, for the first time, a modified mutual information maximization mechanism tailored to long-tailed SSDG scenarios. It enhances robustness by maximizing the mutual information between features and latent labels while incorporating α-entropy regularization to mitigate class bias inherent in standard marginal entropy objectives, thereby adapting effectively to arbitrary class distributions. The method seamlessly integrates into existing SSDG frameworks and demonstrates consistent performance gains across two image modalities, validating its effectiveness and robustness in handling long-tailed data distributions.

📝 Abstract
Semi-supervised domain generalization (SSDG) has recently emerged as an appealing alternative to tackle domain generalization when labeled data is scarce but unlabeled samples across domains are abundant. In this work, we identify an important limitation that hampers the deployment of state-of-the-art methods in more challenging but practical scenarios. In particular, state-of-the-art SSDG methods severely suffer in the presence of long-tailed class distributions, an arguably common situation in real-world settings. To alleviate this limitation, we propose IMaX, a simple yet effective objective based on the well-known InfoMax principle adapted to the SSDG scenario, where the Mutual Information (MI) between the learned features and latent labels is maximized, constrained by the supervision from the labeled samples. Our formulation integrates an α-entropic objective, which mitigates the class-balance bias encoded in the standard marginal entropy term of the MI, thereby better handling arbitrary class distributions. IMaX can be seamlessly plugged into recent state-of-the-art SSDG methods, consistently enhancing their performance, as demonstrated empirically across two different image modalities.
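The abstract describes an InfoMax-style objective: maximize the MI between features and latent labels, estimated from the model's predictions as a marginal-entropy term minus a conditional-entropy term, with the marginal term replaced by an α-entropy to soften the uniform-class bias. The paper's exact formulation is not reproduced here; the sketch below assumes the Tsallis form of α-entropy, and the function names and the α = 2 default are illustrative choices, not taken from the paper.

```python
import numpy as np

def alpha_entropy(p, alpha=2.0, eps=1e-12):
    """Tsallis alpha-entropy of a probability vector (last axis).

    Recovers the Shannon entropy in the limit alpha -> 1; for alpha > 1
    it penalizes uniformity less sharply, which is the property used to
    relax the class-balance bias of the standard marginal entropy.
    """
    p = np.clip(p, eps, 1.0)
    if abs(alpha - 1.0) < 1e-6:
        return -np.sum(p * np.log(p), axis=-1)          # Shannon limit
    return (1.0 - np.sum(p ** alpha, axis=-1)) / (alpha - 1.0)

def imax_objective(probs, alpha=2.0):
    """InfoMax-style surrogate on a batch of softmax predictions.

    MI(Z; Y) is approximated as the alpha-entropy of the batch-averaged
    prediction (diversity across classes) minus the mean per-sample
    alpha-entropy (confidence on each sample). Training maximizes this.
    """
    marginal = probs.mean(axis=0)                       # batch marginal p(y)
    h_marginal = alpha_entropy(marginal, alpha)         # diversity term
    h_conditional = alpha_entropy(probs, alpha).mean()  # confidence term
    return h_marginal - h_conditional
```

For confident, class-diverse predictions the objective is large and positive; collapsed predictions (all samples assigned to one class) drive the marginal term down, so maximizing the objective discourages collapse without forcing an exactly uniform class marginal.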
Problem

Research questions and friction points this paper is trying to address.

long-tailed
semi-supervised domain generalization
class imbalance
domain generalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Information Maximization
Long-Tailed Distribution
Semi-Supervised Domain Generalization
Mutual Information
α-Entropic Objective