FedVG: Gradient-Guided Aggregation for Enhanced Federated Learning

📅 2026-02-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes FedVG, a novel federated learning framework designed to mitigate model drift and degraded generalization caused by client data heterogeneity. FedVG introduces a global public validation set and leverages gradient norms computed on this set across individual network layers to construct client-specific scores for adaptive aggregation. Unlike conventional approaches that rely on local data volume, FedVG pioneers a layer-wise evaluation mechanism based on global validation gradients, significantly enhancing model generalization under heterogeneous conditions. Experimental results demonstrate that FedVG consistently outperforms existing baselines on both natural and medical image datasets, with particularly pronounced gains in highly heterogeneous settings, and effectively boosts the performance of mainstream federated learning algorithms.

📝 Abstract
Federated Learning (FL) enables collaborative model training across multiple clients without sharing their private data. However, data heterogeneity across clients leads to client drift, which degrades the overall generalization performance of the model. This effect is further compounded by overemphasis on poorly performing clients. To address this problem, we propose FedVG, a novel gradient-based federated aggregation framework that leverages a global validation set to guide the optimization process. Such a global validation set can be established using readily available public datasets, ensuring accessibility and consistency across clients without compromising privacy. In contrast to conventional approaches that weight clients by dataset volume, FedVG assesses the generalization ability of client models by measuring the magnitude of validation gradients across layers. Specifically, we compute layer-wise gradient norms to derive a client-specific score that reflects how much each client needs to adjust for improved generalization on the global validation set, thereby enabling more informed and adaptive federated aggregation. Extensive experiments on both natural and medical image benchmark datasets, across diverse model architectures, demonstrate that FedVG consistently improves performance, particularly in highly heterogeneous settings. Moreover, FedVG is modular and can be seamlessly integrated with various state-of-the-art FL algorithms, often further improving their results. Our code is available at https://github.com/alinadevkota/FedVG.
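The aggregation pipeline the abstract describes can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function names (`layerwise_grad_score`, `fedvg_aggregate`), the averaging of per-layer norms into a single score, and the inverse-score weighting (larger validation gradients mean the client needs more adjustment, so it is down-weighted) are all assumptions; the paper's exact score-to-weight mapping may differ.

```python
# Hedged sketch of gradient-guided aggregation in the spirit of FedVG.
# Assumes all client models share one architecture and `val_loader` yields
# batches from the global public validation set. Illustrative names throughout.
import torch
import torch.nn.functional as F

def layerwise_grad_score(model, val_loader, device="cpu"):
    """Mean layer-wise gradient norm of `model` on the global validation set."""
    model.to(device)
    model.zero_grad()
    for x, y in val_loader:
        loss = F.cross_entropy(model(x.to(device)), y.to(device))
        loss.backward()  # gradients accumulate over validation batches
    norms = [p.grad.norm().item() for p in model.parameters() if p.grad is not None]
    return sum(norms) / len(norms)

def fedvg_aggregate(global_model, client_models, val_loader, eps=1e-8):
    """Average client weights using validation-gradient scores instead of data volume."""
    scores = torch.tensor([layerwise_grad_score(m, val_loader) for m in client_models])
    # Assumption: a larger score signals worse generalization, so invert and normalize.
    weights = 1.0 / (scores + eps)
    weights = weights / weights.sum()
    new_state = {}
    for key in global_model.state_dict():
        stacked = torch.stack([m.state_dict()[key].float() for m in client_models])
        shape = (-1,) + (1,) * (stacked.dim() - 1)  # broadcast weights over parameters
        new_state[key] = (weights.view(shape) * stacked).sum(dim=0)
    global_model.load_state_dict(new_state)
    return global_model
```

Because the weights form a convex combination, each aggregated parameter stays within the elementwise range spanned by the client parameters, unlike volume-based weighting, which can be dominated by a single large but drifted client.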
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
data heterogeneity
client drift
generalization performance
gradient-based aggregation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Learning
Gradient-Guided Aggregation
Data Heterogeneity
Global Validation Set
Client Drift