Beyond Fixed Rounds: Data-Free Early Stopping for Practical Federated Learning

📅 2026-01-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the high computational overhead and privacy risks of conventional early stopping in federated learning, which typically relies on a fixed number of communication rounds or on access to validation data. The authors propose the first validation-free early stopping framework, which dynamically determines the stopping round by monitoring the growth rate of the task vector on the server side. The approach is data-agnostic and compatible with mainstream federated learning algorithms. Experiments on skin lesion and blood cell classification show that the method outperforms validation-based early stopping, improving accuracy by 12.5% and 10.3% respectively, while requiring only 47 and 20 communication rounds on average. The framework thus balances training efficiency with privacy preservation.

📝 Abstract
Federated Learning (FL) facilitates decentralized collaborative learning without transmitting raw data. However, reliance on fixed global rounds or on validation data for hyperparameter tuning hinders practical deployment by incurring high computational costs and privacy risks. To address this, we propose a data-free early stopping framework that determines the optimal stopping point by monitoring the task vector's growth rate using only server-side parameters. Numerical results on skin lesion and blood cell classification demonstrate that our approach is comparable to validation-based early stopping across various state-of-the-art FL methods. In particular, the proposed framework needs an average of 47/20 (skin lesion/blood cell) rounds to achieve over 12.5%/10.3% higher performance than early stopping based on validation data. To the best of our knowledge, this is the first work to propose an early stopping framework for FL methods without using any validation data.
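The stopping criterion described above can be illustrated with a minimal sketch. The paper only states that the server monitors the growth rate of the task vector (the current global parameters minus the initial parameters); the specific norm, threshold (`growth_threshold`), and patience logic below are illustrative assumptions, not the authors' exact rule:

```python
# Hedged sketch of data-free early stopping via task-vector growth monitoring.
# All hyperparameter names (growth_threshold, patience) are assumptions.
import numpy as np

def task_vector_norm(global_params, init_params):
    """L2 norm of the task vector: current global weights minus initial weights."""
    return float(np.linalg.norm(np.concatenate([
        (g - i).ravel() for g, i in zip(global_params, init_params)
    ])))

def should_stop(norm_history, growth_threshold=1e-3, patience=3):
    """Assumed criterion: stop once the relative growth rate of the
    task-vector norm stays below growth_threshold for `patience` rounds."""
    if len(norm_history) < patience + 1:
        return False
    rates = [
        (norm_history[k] - norm_history[k - 1]) / max(norm_history[k - 1], 1e-12)
        for k in range(len(norm_history) - patience, len(norm_history))
    ]
    return all(abs(r) < growth_threshold for r in rates)
```

In a server loop, the server would append `task_vector_norm(...)` after each aggregation round and break out of training when `should_stop(...)` returns `True`; no client-held validation data is involved.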
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
early stopping
validation data
hyperparameter tuning
privacy risk
Innovation

Methods, ideas, or system contributions that make the work stand out.

data-free early stopping
federated learning
task vector growth rate
server-side parameters
validation-free