Local Gradient Regulation Stabilizes Federated Learning under Client Heterogeneity

📅 2026-01-07
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses systematic drift in federated learning caused by distorted local gradients under client data heterogeneity, which impedes global convergence. To mitigate this issue, the authors propose an Exploratory–Convergent Gradient Re-aggregation (ECGR) mechanism inspired by swarm intelligence, which dynamically analyzes and re-aggregates local gradients from the client perspective. ECGR suppresses unstable gradient components while preserving informative updates, without introducing additional communication overhead, and integrates seamlessly into existing federated learning frameworks. Theoretical analysis and extensive experiments demonstrate that ECGR significantly enhances the convergence stability and performance of mainstream federated algorithms in heterogeneous settings, particularly on medical imaging benchmarks such as LC25000.

📝 Abstract
Federated learning (FL) enables collaborative model training across distributed clients without sharing raw data, yet its stability is fundamentally challenged by statistical heterogeneity in realistic deployments. Here, we show that client heterogeneity destabilizes FL primarily by distorting local gradient dynamics during client-side optimization, causing systematic drift that accumulates across communication rounds and impedes global convergence. This observation highlights local gradients as a key regulatory lever for stabilizing heterogeneous FL systems. Building on this insight, we develop a general client-side perspective that regulates local gradient contributions without incurring additional communication overhead. Inspired by swarm intelligence, we instantiate this perspective through Exploratory–Convergent Gradient Re-aggregation (ECGR), which balances well-aligned and misaligned gradient components to preserve informative updates while suppressing destabilizing effects. Theoretical analysis and extensive experiments, including evaluations on the LC25000 medical imaging dataset, demonstrate that regulating local gradient dynamics consistently stabilizes federated learning across state-of-the-art methods under heterogeneous data distributions.
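The abstract does not give ECGR's exact update rule, but the idea of balancing well-aligned and misaligned gradient components can be illustrated with a minimal sketch. The projection-based split and the damping coefficients `alpha` and `beta` below are assumptions for illustration, not the paper's actual formulation:

```python
import numpy as np

def reaggregate_gradient(g_local, g_ref, alpha=1.0, beta=0.5):
    """Illustrative alignment-based re-aggregation of a local gradient.

    Splits g_local into a component parallel to a reference direction
    g_ref (e.g. the previous global update) and an orthogonal remainder,
    then keeps the aligned part (weight alpha) while damping the
    misaligned part (weight beta). Hypothetical sketch, not ECGR itself.
    """
    ref_norm_sq = np.dot(g_ref, g_ref)
    if ref_norm_sq == 0.0:
        return g_local  # no reference direction to align with
    # Component of the local gradient along the reference direction
    parallel = (np.dot(g_local, g_ref) / ref_norm_sq) * g_ref
    # Misaligned remainder, orthogonal to the reference direction
    orthogonal = g_local - parallel
    return alpha * parallel + beta * orthogonal

# Example: a local gradient that half-agrees with the reference direction
g = reaggregate_gradient(np.array([1.0, 1.0]), np.array([1.0, 0.0]))
```

Because the re-weighting is applied client-side to the locally computed gradient before it is sent, the message to the server keeps its usual size, consistent with the abstract's claim of no additional communication overhead.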
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
Client Heterogeneity
Gradient Dynamics
Statistical Heterogeneity
Convergence Stability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Local Gradient Regulation
Client Heterogeneity
Federated Learning Stability
Exploratory–Convergent Gradient Re-aggregation
Swarm Intelligence