pFedBBN: A Personalized Federated Test-Time Adaptation with Balanced Batch Normalization for Class-Imbalanced Data

📅 2025-11-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
In federated learning, test-time adaptation (TTA) faces the dual challenges of class imbalance (CI) and domain shift, particularly under unsupervised and privacy-preserving constraints. To address this, we propose pFedBBN, a personalized federated TTA framework that introduces Balanced Batch Normalization (BBN) into the federated TTA setting for the first time. pFedBBN enables label-free, decentralized local adaptation via class-aware model aggregation and a lightweight, feature-similarity-driven collaboration mechanism, all without cross-client data sharing. It ensures both fairness and robustness while preserving client privacy. Extensive experiments on multiple benchmarks demonstrate that pFedBBN significantly improves minority-class accuracy and overall generalization, outperforming existing federated learning and TTA methods.

📝 Abstract
Test-time adaptation (TTA) in federated learning (FL) is crucial for handling unseen data distributions across clients, particularly when faced with domain shifts and skewed class distributions. Class imbalance (CI) remains a fundamental challenge in FL, where rare but critical classes are often severely underrepresented in individual client datasets. Although prior work has addressed CI during training through reliable aggregation and local class distribution alignment, these methods typically rely on access to labeled data or coordination among clients, and none address unsupervised adaptation to dynamic domains or distribution shifts at inference time under federated CI constraints. Revealing the failure of state-of-the-art TTA methods in federated client adaptation under CI scenarios, we propose pFedBBN, a personalized federated test-time adaptation framework that employs balanced batch normalization (BBN) during local client adaptation to mitigate prediction bias by treating all classes equally, while also enabling client collaboration guided by BBN similarity, ensuring that clients with similar balanced representations reinforce each other and that adaptation remains aligned with domain-specific characteristics. pFedBBN supports fully unsupervised local adaptation and introduces a class-aware model aggregation strategy that enables personalized inference without compromising privacy. It addresses both distribution shifts and class imbalance through balanced feature normalization and domain-aware collaboration, without requiring any labeled or raw data from clients. Extensive experiments across diverse baselines show that pFedBBN consistently enhances robustness and minority-class performance over state-of-the-art FL and TTA methods.
Problem

Research questions and friction points this paper is trying to address.

Addresses class imbalance in federated learning with test-time adaptation
Mitigates prediction bias using balanced batch normalization for all classes
Enables unsupervised local adaptation without client data sharing
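The core idea behind balanced batch normalization, as the bullets above describe it, is to keep majority classes from dominating the normalization statistics. A minimal sketch of this idea (the paper's exact formulation is not reproduced here; the function names and the use of pseudo-labels for the unsupervised setting are assumptions):

```python
import numpy as np

def balanced_bn_stats(features, pseudo_labels, num_classes):
    """Compute class-balanced mean/variance: per-class statistics are
    averaged with equal weight per class, so a 90/10 class split
    contributes 50/50 to the normalization statistics."""
    class_means, class_vars = [], []
    for c in range(num_classes):
        mask = pseudo_labels == c
        if mask.sum() == 0:
            continue  # class absent from this batch
        feats_c = features[mask]
        class_means.append(feats_c.mean(axis=0))
        class_vars.append(feats_c.var(axis=0))
    mu = np.mean(class_means, axis=0)
    # total variance = mean within-class variance + between-class spread
    var = (np.mean(class_vars, axis=0)
           + np.mean([(m - mu) ** 2 for m in class_means], axis=0))
    return mu, var

def balanced_normalize(features, mu, var, eps=1e-5):
    """Normalize features with the balanced statistics."""
    return (features - mu) / np.sqrt(var + eps)
```

With a heavily imbalanced batch, the balanced mean sits between the class means rather than near the majority class, which is what removes the prediction bias the bullets refer to.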
Innovation

Methods, ideas, or system contributions that make the work stand out.

Balanced batch normalization mitigates class imbalance bias
Client collaboration guided by BBN similarity enhances adaptation
Class-aware aggregation enables personalized privacy-preserving inference
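One way to read the "collaboration guided by BBN similarity" bullet is that each client weights its peers' models by how similar their balanced normalization statistics are, producing a personalized aggregate without sharing raw data. A hedged sketch of that idea (function names, the cosine metric, and the softmax weighting are illustrative assumptions, not the paper's exact scheme):

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two flattened statistics vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def similarity_weighted_aggregate(client_stats, client_params, temperature=1.0):
    """For each client, weight every client's parameters by the softmax
    of BBN-statistic similarity, yielding one personalized model per
    client; only statistics and parameters are exchanged, not data."""
    n = len(client_stats)
    personalized = []
    for i in range(n):
        sims = np.array([cosine_sim(client_stats[i], client_stats[j])
                         for j in range(n)])
        weights = np.exp(sims / temperature)
        weights /= weights.sum()
        personalized.append(sum(w * p for w, p in zip(weights, client_params)))
    return personalized
```

Clients whose balanced statistics point in similar directions end up exchanging the most weight, so adaptation stays aligned with each client's domain, as the bullet describes.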
Md Akil Raihan Iftee
Center for Computational & Data Sciences (CCDS), Independent University, Bangladesh
Syed Md. Ahnaf Hasan
Center for Computational & Data Sciences (CCDS), Independent University, Bangladesh
Mir Sazzat Hossain
Center for Computational & Data Sciences (CCDS), Independent University, Bangladesh
Rakibul Hasan Rajib
University of Central Florida, USA
Amin Ahsan Ali
Independent University, Bangladesh
AKM Mahbubur Rahman
Center for Computational & Data Sciences (CCDS), Independent University, Bangladesh
Sajib Mistry
Curtin University, Australia
Monowar Bhuyan
Associate Professor & WASP Fellow, Umeå University, Sweden