FedPURIN: Programmed Update and Reduced INformation for Sparse Personalized Federated Learning

πŸ“… 2025-10-16
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the high communication overhead and deployment constraints that statistical heterogeneity imposes on personalized federated learning (PFL), this paper proposes FedPURIN, a framework that, for the first time, formulates the PFL parameter-update mechanism as an integer program. By modeling and optimizing parameter decoupling, FedPURIN precisely identifies critical parameter subsets, enabling programmable sparse updates and aggregation. Its core innovation is a sparsification paradigm that jointly optimizes communication efficiency and personalized model performance. Evaluated on standard image classification benchmarks under non-IID data settings, FedPURIN achieves state-of-the-art (SOTA) accuracy while significantly reducing communication volume. Experimental results demonstrate its practicality and scalability in edge intelligence scenarios, where bandwidth and device resources are tightly constrained.

πŸ“ Abstract
Personalized Federated Learning (PFL) has emerged as a critical research frontier addressing the data heterogeneity issue across distributed clients. Novel model architectures and collaboration mechanisms are engineered to accommodate statistical disparities while producing client-specific models. Parameter decoupling represents a promising paradigm for maintaining model performance in PFL frameworks. However, the communication efficiency of many existing methods remains suboptimal, sustaining substantial communication burdens that impede practical deployment. To bridge this gap, we propose Federated Learning with Programmed Update and Reduced INformation (FedPURIN), a novel framework that strategically identifies critical parameters for transmission through an integer programming formulation. This mathematically grounded strategy is seamlessly integrated into a sparse aggregation scheme, achieving a significant communication reduction while preserving efficacy. Comprehensive evaluations on standard image classification benchmarks under varied non-IID conditions demonstrate competitive performance relative to state-of-the-art methods, coupled with quantifiable communication reduction through sparse aggregation. The framework establishes a new paradigm for communication-efficient PFL, particularly advantageous for edge intelligence systems operating over heterogeneous data sources.
Problem

Research questions and friction points this paper is trying to address.

Addressing data heterogeneity in personalized federated learning
Reducing communication burdens through sparse parameter transmission
Improving efficiency for edge intelligence with heterogeneous data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Programmed parameter selection via integer programming
Sparse aggregation scheme reduces communication burden
Maintains model efficacy with quantifiable communication reduction
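The two innovations above can be sketched together. As a hedged illustration (the paper's actual integer-programming objective and aggregation rule are not given in this summary), a cardinality-constrained selection problem of the form max_m Ξ£ m_i |Ξ”_i| subject to Ξ£ m_i ≀ k, m_i ∈ {0,1} is solved exactly by keeping the k largest-magnitude update entries, and the server can then average each coordinate only over the clients that transmitted it:

```python
import numpy as np

def select_critical_update(delta, budget):
    """Return a boolean mask marking which entries of the client update
    `delta` to transmit. Keeping the `budget` largest-magnitude entries
    solves the cardinality-constrained integer program
        max_m sum_i m_i * |delta_i|  s.t.  sum_i m_i <= budget, m_i in {0,1}.
    (Hypothetical criterion; FedPURIN's actual IP objective may differ.)
    """
    flat = np.abs(delta).ravel()
    k = min(budget, flat.size)
    idx = np.argpartition(flat, -k)[-k:]      # indices of the top-k magnitudes
    mask = np.zeros(flat.size, dtype=bool)
    mask[idx] = True
    return mask.reshape(delta.shape)

def sparse_aggregate(updates, masks):
    """Server side: mask-weighted mean, i.e. average each coordinate only
    over the clients that actually sent it; coordinates no one sent stay 0."""
    num = sum(u * m for u, m in zip(updates, masks))
    den = sum(m.astype(float) for m in masks)
    return np.where(den > 0, num / np.maximum(den, 1.0), 0.0)
```

Each client then uploads only the masked entries (indices plus values), which is where the quantifiable communication reduction comes from: the payload scales with `budget` rather than with the full model size.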
Lunchen Xie
School of Computer Science and Technology, Tongji University
distributed machine learning · federated learning · neural architecture search
Zehua He
School of Computer Science and Technology, Tongji University, No.4800 Cao'an Hwy, Jiading District, Shanghai, 201804, China
Qingjiang Shi
School of Computer Science and Technology, Tongji University, No.4800 Cao'an Hwy, Jiading District, Shanghai, 201804, China; Shenzhen Research Institute of Big Data, 2001 Longxiang Avenue, Longgang District, Shenzhen, 518172, China