Green Federated Learning via Carbon-Aware Client and Time Slot Scheduling

📅 2025-09-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Federated learning (FL) incurs substantial carbon emissions due to distributed training across geographically dispersed clients, especially when powered by carbon-intensive electricity grids. Method: This paper proposes a carbon-aware client selection and training scheduling framework integrating elastic time windows, α-fair carbon quota allocation, and a global fine-tuning phase. It jointly models spatiotemporal heterogeneity of grid carbon intensity, client-specific delayed training mechanisms, and bias-compensated convergence optimization. Contribution/Results: To the best of our knowledge, this is the first work to co-model dynamic carbon intensity forecasting, elastic scheduling, and fair carbon resource allocation—balancing system efficiency and environmental equity. Experiments on real-world carbon intensity data demonstrate that, under diverse carbon budget constraints, our approach reduces carbon footprint by up to 38.2% while improving model accuracy by 1.7–4.3% over time-rigid baselines; gains are particularly pronounced under stringent carbon budgets.

Technology Category

Application Category

📝 Abstract
Training large-scale machine learning models incurs substantial carbon emissions. Federated Learning (FL), by distributing computation across geographically dispersed clients, offers a natural framework to leverage regional and temporal variations in Carbon Intensity (CI). This paper investigates how to reduce emissions in FL through carbon-aware client selection and training scheduling. We first quantify the emission savings of a carbon-aware scheduling policy that leverages slack time -- permitting a modest extension of the training duration so that clients can defer local training rounds to lower-carbon periods. We then examine the performance trade-offs of such scheduling which stem from statistical heterogeneity among clients, selection bias in participation, and temporal correlation in model updates. To leverage these trade-offs, we construct a carbon-aware scheduler that integrates slack time, $α$-fair carbon allocation, and a global fine-tuning phase. Experiments on real-world CI data show that our scheduler outperforms slack-agnostic baselines, achieving higher model accuracy across a wide range of carbon budgets, with especially strong gains under tight carbon constraints.
Problem

Research questions and friction points this paper is trying to address.

Reducing carbon emissions in federated learning training
Leveraging carbon intensity variations via client scheduling
Balancing emission savings with model performance trade-offs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Carbon-aware client selection scheduling
Slack time extension for low-carbon periods
Alpha-fair carbon allocation integration
🔎 Similar Papers
No similar papers found.
D
Daniel Richards Arputharaj
Inria Centre at Universit´e Cˆote d’Azur, France
C
Charlotte Rodriguez
Inria Centre at Universit´e Cˆote d’Azur, France; Accenture, France
Angelo Rodio
Angelo Rodio
Linköping University
Machine LearningFederated LearningOptimizationDistributed Systems
Giovanni Neglia
Giovanni Neglia
Inria Sophia Antipolis Méditerranée
computer networkssmart gridsmodelingperformance evaluation