FedSGM: A Unified Framework for Constraint-Aware, Bidirectionally Compressed, Multi-Step Federated Optimization

📅 2026-01-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses four major challenges in federated learning—functional constraints, communication compression, multi-step local updates, and partial client participation—by proposing a unified optimization framework that avoids projections or dual variables. The method employs a switching gradient mechanism combined with bidirectional error feedback to mitigate compression-induced noise and introduces a soft switching strategy to stabilize updates near the boundary of the feasible region. To the best of our knowledge, this is the first approach to jointly handle all four challenges within a single framework. Theoretically, the algorithm achieves a convergence rate of $O(1/\sqrt{T})$ with high probability. Empirical evaluations on Neyman-Pearson classification and constrained Markov decision process tasks demonstrate its effectiveness and practicality.

📝 Abstract
We introduce FedSGM, a unified framework for federated constrained optimization that addresses four major challenges in federated learning (FL): functional constraints, communication bottlenecks, local updates, and partial client participation. Building on the switching gradient method, FedSGM provides projection-free, primal-only updates, avoiding expensive dual-variable tuning or inner solvers. To handle communication limits, FedSGM incorporates bi-directional error feedback, correcting the bias introduced by compression while explicitly accounting for the interaction between compression noise and multi-step local updates. We derive convergence guarantees showing that the averaged iterate achieves the canonical $\boldsymbol{\mathcal{O}}(1/\sqrt{T})$ rate, with additional high-probability bounds that decouple optimization progress from sampling noise due to partial participation. Additionally, we introduce a soft-switching version of FedSGM to stabilize updates near the feasibility boundary. To our knowledge, FedSGM is the first framework to unify functional constraints, compression, multiple local updates, and partial client participation, establishing a theoretically grounded foundation for constrained federated learning. Finally, we validate the theoretical guarantees of FedSGM via experiments on Neyman-Pearson classification and constrained Markov decision process (CMDP) tasks.
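The abstract does not spell out the update rules, but a minimal single-machine sketch of the two ingredients it names, a switching gradient step and error-feedback compression, might look like the following. The function names, the top-k compressor, and the toy constrained problem in the usage note are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def top_k(v, k):
    """Illustrative compressor: keep only the k largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def switching_gradient_step(x, grad_f, grad_g, g, tol, lr):
    """Switching gradient method: descend the objective f when the
    constraint g(x) <= 0 holds (within tolerance tol), otherwise descend
    the constraint violation -- no projection and no dual variable."""
    d = grad_f(x) if g(x) <= tol else grad_g(x)
    return x - lr * d

def ef_compress(update, memory, k):
    """Error feedback: compress the update plus the residual memory, then
    carry the compression error forward so the bias cancels over rounds."""
    corrected = update + memory
    sent = top_k(corrected, k)
    return sent, corrected - sent
```

As a sanity check on the switching rule, minimizing $f(x) = \|x - a\|^2$ subject to $g(x) = \|x\|^2 - 1 \le 0$ with $a$ outside the unit ball drives the iterate toward the boundary and then oscillates near it, which is the regime the paper's soft-switching variant is designed to stabilize.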
Problem

Research questions and friction points this paper is trying to address.

federated learning
functional constraints
communication compression
partial client participation
local updates
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Constrained Optimization
Bidirectional Compression
Switching Gradient Method
Error Feedback
Partial Client Participation
Antesh Upadhyay
School of Electrical and Computer Engineering, Purdue University
Sang Bin Moon
School of Electrical and Computer Engineering, Purdue University
Abolfazl Hashemi
Assistant Professor of ECE, Purdue University
Large-Scale Optimization