Sporadic Gradient Tracking over Directed Graphs: A Theoretical Perspective on Decentralized Federated Learning

📅 2026-01-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges of collaborative training in decentralized federated learning under data heterogeneity, directed communication topologies, and heterogeneous client resources that lead to sporadic participation. To this end, the authors propose Sporadic Gradient Tracking (Spod-GT), an algorithm that, for the first time, integrates gradient tracking with stochastic intermittent participation over directed graphs. Spod-GT allows clients to customize their computation and communication frequencies while providing convergence guarantees under milder assumptions on gradient variance and diversity. The theoretical analysis jointly models directed-graph communication and random client scheduling. Experimental results on image classification tasks demonstrate that Spod-GT significantly outperforms existing gradient tracking baselines in convergence speed and robustness.

📝 Abstract
Decentralized Federated Learning (DFL) enables clients with local data to collaborate in a peer-to-peer manner to train a generalized model. In this paper, we unify two branches of work that have separately solved important challenges in DFL: (i) gradient tracking techniques for mitigating data heterogeneity and (ii) accounting for diverse availability of resources across clients. We propose $\textit{Sporadic Gradient Tracking}$ ($\texttt{Spod-GT}$), the first DFL algorithm that incorporates these factors over general directed graphs by allowing (i) client-specific gradient computation frequencies and (ii) heterogeneous and asymmetric communication frequencies. We conduct a rigorous convergence analysis of our methodology with relaxed assumptions on gradient estimation variance and gradient diversity of clients, providing consensus and optimality guarantees for GT over directed graphs despite intermittent client participation. Through numerical experiments on image classification datasets, we demonstrate the efficacy of $\texttt{Spod-GT}$ compared to well-known GT baselines.
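The gradient tracking (GT) technique the abstract builds on can be illustrated with a minimal sketch. The following is not the paper's Spod-GT algorithm: it is a standard DIGing-style GT update on an assumed 3-node directed ring with scalar quadratic local objectives, and the weight matrix, step size, and iteration count are illustrative choices. Spod-GT additionally handles client-specific, sporadic computation and communication frequencies over general directed graphs, which this sketch omits.

```python
# DIGing-style gradient tracking on a directed 3-node ring (illustrative sketch;
# NOT the paper's Spod-GT). Each node i minimizes f_i(x) = 0.5 * (x - b[i])**2,
# so the average objective is minimized at mean(b).

N = 3
b = [1.0, 2.0, 6.0]           # hypothetical local data; global optimum is mean(b) = 3
grad = lambda i, x: x - b[i]  # gradient of f_i

# Directed ring with self-loops: node i receives from itself and node i-1.
# This particular circulant weight matrix happens to be doubly stochastic.
W = [[0.5 if j in (i, (i - 1) % N) else 0.0 for j in range(N)]
     for i in range(N)]

alpha = 0.1                   # step size (assumed, not tuned to the paper)
x = [0.0] * N                 # local models
y = [grad(i, x[i]) for i in range(N)]  # trackers initialized to local gradients

for _ in range(500):
    # Model update: mix neighbors' models, step along the tracked direction.
    x_new = [sum(W[i][j] * x[j] for j in range(N)) - alpha * y[i]
             for i in range(N)]
    # Tracker update: mix neighbors' trackers, add the local gradient change,
    # so each y[i] tracks the average gradient across nodes.
    y = [sum(W[i][j] * y[j] for j in range(N))
         + grad(i, x_new[i]) - grad(i, x[i])
         for i in range(N)]
    x = x_new

print(x)  # all entries approach 3.0, the minimizer of the average objective
```

Despite the heterogeneous local minimizers (1, 2, and 6), every node converges to the global optimum 3, which is the consensus-plus-optimality behavior that motivates GT under data heterogeneity.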
Problem

Research questions and friction points this paper is trying to address.

Decentralized Federated Learning
Gradient Tracking
Directed Graphs
Data Heterogeneity
Client Availability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sporadic Gradient Tracking
Decentralized Federated Learning
Directed Graphs
Heterogeneous Communication
Gradient Tracking
Shahryar Zehtabi
School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, USA
Dong-Jun Han
Assistant Professor, Yonsei University
Edge AI · On-Device AI · Federated Learning · Distributed Machine Learning · Wireless Network
Seyyedali Hosseinalipour
Department of Electrical Engineering, University at Buffalo – SUNY, Buffalo, NY, USA
Christopher Brinton
School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, USA