Personalized Federated Learning via Gaussian Generative Modeling

📅 2026-03-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work targets the lack of representation-level personalization in federated learning under heterogeneous client data by proposing pFedGM, a novel personalized federated learning approach. pFedGM introduces Gaussian generative models into personalized federated learning for the first time, employing a decoupled architecture, comprising a navigator and a statistic extractor, to disentangle global features from local data distributions. Inspired by the Kalman gain, it establishes a dual-scale fusion framework that enables Bayesian-inference-driven learning of personalized classification heads. By integrating weighted resampling with a dual-objective optimization strategy that maximizes inter-class distances while minimizing intra-class distances, pFedGM achieves state-of-the-art or competitive performance across diverse heterogeneity settings, including variations in class cardinality and environmental noise, on multiple benchmark datasets.
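The Kalman-gain-inspired fusion can be illustrated with the standard Kalman update that blends two Gaussian estimates of the same quantity: a global prior (mean and covariance from the server) and local client statistics. The paper's exact formulation is not reproduced here, so the function and variable names below are illustrative assumptions, not pFedGM's actual API.

```python
import numpy as np

def kalman_fuse(mu_global, cov_global, mu_local, cov_local):
    """Fuse a global Gaussian prior with local class statistics.

    Standard Kalman-gain update for two Gaussian estimates of the
    same mean: K = P_g (P_g + P_l)^{-1}. A confident (low-variance)
    local estimate pulls the fused mean toward the local statistics.
    Names are illustrative, not the paper's implementation.
    """
    gain = cov_global @ np.linalg.inv(cov_global + cov_local)
    mu_fused = mu_global + gain @ (mu_local - mu_global)
    cov_fused = (np.eye(len(mu_global)) - gain) @ cov_global
    return mu_fused, cov_fused

# Toy 2-D example: local covariance is 4x tighter than the global
# prior, so the fused mean sits 80% of the way toward the local mean.
mu_g, P_g = np.array([0.0, 0.0]), np.eye(2) * 4.0
mu_l, P_l = np.array([2.0, 2.0]), np.eye(2) * 1.0
mu_f, P_f = kalman_fuse(mu_g, P_g, mu_l, P_l)
```

The same gain matrix weights each client's contribution, so clients with sparse or noisy local data lean more on the shared prior.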

📝 Abstract
Federated learning has emerged as a paradigm to train models collaboratively on inherently distributed client data while safeguarding privacy. In this context, personalized federated learning tackles the challenge of data heterogeneity by equipping each client with a dedicated model. A prevalent strategy decouples the model into a shared feature extractor and a personalized classifier head, where the latter actively guides the representation learning. However, previous works have focused on classifier head-guided personalization, neglecting the potential personalized characteristics in the representation distribution. Building on this insight, we propose pFedGM, a method based on Gaussian generative modeling. The approach begins by training a Gaussian generator that models client heterogeneity via weighted re-sampling. A balance between global collaboration and personalization is then struck by employing a dual objective: a shared objective that maximizes inter-class distance across clients, and a local objective that minimizes intra-class distance within them. To achieve this, we decouple the conventional Gaussian classifier into a navigator for global optimization, and a statistic extractor for capturing distributional statistics. Inspired by the Kalman gain, the algorithm then employs a dual-scale fusion framework at global and local levels to equip each client with a personalized classifier head. In this framework, we model the global representation distribution as a prior and the client-specific data as the likelihood, enabling Bayesian inference for class probability estimation. The evaluation covers a comprehensive range of scenarios: heterogeneity in class counts, environmental corruption, and multiple benchmark datasets and configurations. pFedGM achieves superior or competitive performance compared to state-of-the-art methods.
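The abstract's final ingredient, treating the global representation distribution as a prior and client data as the likelihood, amounts to Bayesian class-probability estimation with Gaussian class-conditionals. A minimal sketch of that inference step follows; the class means, covariances, and priors here are toy stand-ins, not statistics produced by pFedGM's extractor.

```python
import numpy as np

def gaussian_log_likelihood(z, mu, cov):
    """Log-density log N(z; mu, cov) for a single feature vector z."""
    d = len(mu)
    diff = z - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2 * np.pi) + logdet
                   + diff @ np.linalg.solve(cov, diff))

def class_posteriors(z, means, covs, priors):
    """p(y = c | z) ∝ p(z | c) p(c), normalized over classes."""
    log_post = np.array([
        gaussian_log_likelihood(z, mu, cov) + np.log(p)
        for mu, cov, p in zip(means, covs, priors)
    ])
    log_post -= log_post.max()            # subtract max for numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# Two classes in 2-D; the query feature sits at class 0's mean,
# so the posterior concentrates on class 0.
means = [np.zeros(2), np.full(2, 3.0)]
covs = [np.eye(2), np.eye(2)]
priors = [0.5, 0.5]
p = class_posteriors(np.zeros(2), means, covs, priors)
```

Working in log space and subtracting the maximum before exponentiating keeps the normalization stable even when class likelihoods differ by many orders of magnitude.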
Problem

Research questions and friction points this paper is trying to address.

Personalized Federated Learning
Data Heterogeneity
Representation Distribution
Gaussian Generative Modeling
Client-Specific Personalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Personalized Federated Learning
Gaussian Generative Modeling
Representation Distribution
Dual-scale Fusion
Bayesian Inference