🤖 AI Summary
To address the severe degradation in tail-class performance caused by the coexistence of non-IID and long-tailed data distributions in federated learning, this paper proposes the first class-aware prompt tuning framework for vision-language models (VLMs). Methodologically, we introduce a dual-prompt coordination mechanism—comprising a generic prompt and a class-aware prompt—to jointly facilitate global knowledge aggregation and enhance discriminability for tail classes. Furthermore, we propose a heterogeneity-aware client clustering strategy based on distributional similarity to enable data-driven knowledge sharing while preserving tail-class representations. Technically, our approach integrates CLIP fine-tuning, learnable prompt optimization, federated aggregation, and a tailored long-tailed loss function. Extensive experiments on multiple long-tailed benchmark datasets demonstrate a 12.7% improvement in average tail-class accuracy, with no compromise to overall model accuracy—outperforming existing federated long-tailed learning methods by a significant margin.
📝 Abstract
Effectively handling the co-occurrence of non-IID data and long-tailed distributions remains a critical challenge in federated learning. While fine-tuning vision-language models (VLMs) like CLIP has shown to be promising in addressing non-IID data challenges, this approach leads to severe degradation of tail classes in federated long-tailed scenarios. Under the composite effects of strong non-IID data distribution and long-tailed class imbalances, VLM fine-tuning may even fail to yield any improvement. To address this issue, we propose Class-Aware Prompt Learning for Federated Long-tailed Learning (CAPT), a novel framework that leverages a pre-trained VLM to effectively handle both data heterogeneity and long-tailed distributions. CAPT introduces a dual-prompt mechanism that synergizes general and class-aware prompts, enabling the framework to capture global trends while preserving class-specific knowledge. To better aggregate and share knowledge across clients, we introduce a heterogeneity-aware client clustering strategy that groups clients based on their data distributions, enabling efficient collaboration and knowledge sharing. Extensive experiments on various long-tailed datasets with different levels of data heterogeneity demonstrate that CAPT significantly improves tail class performance without compromising overall accuracy, outperforming state-of-the-art methods in federated long-tailed learning scenarios.