Coherence-Aware Over-the-Air Distributed Learning under Heterogeneous Link Impairments

📅 2026-03-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses downlink distortion and uplink over-the-air aggregation errors in wireless federated learning caused by heterogeneous device coherence times and bandwidth constraints. To tackle these challenges, the authors propose a coherence-aware joint communication-learning optimization framework. By partitioning OFDM super-blocks into sub-blocks aligned with the shortest coherence time, superimposing the global model symbols intended for static devices onto the pilot tones used by dynamic devices via product superposition, and substituting prior local models for missing entries to mitigate partial reception, the approach repurposes pilot overhead as effective payload. The method provides convergence guarantees under imperfect channel state information and aggregation noise, improving communication efficiency, reducing latency, and achieving higher learning accuracy than conventional federated learning baselines.
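The product-superposition idea in the summary can be illustrated with a toy, noiseless sketch (all names, dimensions, and channel values below are hypothetical; the paper's actual signaling operates over OFDM sub-blocks with estimation noise):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sub-block of T symbols; the dynamic device expects a
# unit pilot in the first slot followed by payload symbols.
T = 8
x_dyn = np.concatenate(([1.0], rng.choice([-1.0, 1.0], T - 1)))

# Product superposition: a static-device symbol multiplies the whole
# dynamic frame, so the pilot slot now also carries s_static as payload.
s_static = rng.choice([-1.0, 1.0])
x_tx = s_static * x_dyn

# Static device (long coherence) already knows its channel h_s and can
# recover s_static directly from the pilot slot.
h_s = 0.9 + 0.3j
y_static = h_s * x_tx
s_hat = np.sign(np.real(y_static[0] / h_s))

# Dynamic device estimates the *composite* gain h_d * s_static from the
# pilot slot; the unknown sign of s_static cancels out of its own
# payload detection.
h_d = 0.5 - 0.8j
y_dyn = h_d * x_tx
g_hat = y_dyn[0]                               # estimate of h_d * s_static
payload_hat = np.sign(np.real(y_dyn[1:] / g_hat))

assert s_hat == s_static
assert np.array_equal(payload_hat, x_dyn[1:])
```

The key point is that the same pilot slot simultaneously serves channel estimation for the short-coherence device and data delivery to the long-coherence device, which is what "turning pilot overhead into payload" refers to.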

📝 Abstract
Distributed machine learning (ML) over wireless networks hinges on accurate channel state information (CSI) and efficient exchange of high-dimensional model updates. These demands are governed by channel coherence time and bandwidth, which vary across devices (links) due to heterogeneous mobility and scattering, causing degraded downlink delivery and distorted uplink over-the-air (OTA) aggregation. We propose a coherence-aware federated learning (FL) framework that jointly addresses impairments on downlink and uplink with communication-efficient strategies. In the downlink, we employ product superposition to multiplex global model symbols for long-coherence (static) devices onto the pilot tones required by short-coherence (dynamic) devices for channel estimation, turning pilot overhead into payload while preserving estimation fidelity. In the proposed scheme, an orthogonal frequency-division multiplexing (OFDM) super-block is partitioned into sub-blocks aligned with the smallest coherence time and bandwidth, enabling consistent channel estimation and stabilizing OTA aggregation across heterogeneous devices. Partial model reception at dynamic devices is mitigated via previous local model filling (PLMF), which reuses prior updates. We establish convergence guarantees under heterogeneous link impairments, imperfect CSI, and aggregation noise. The proposed framework enables efficient scheduling under coherence heterogeneity; analysis and experiments demonstrate notable gains in communication efficiency, latency, and learning accuracy over conventional FL baselines.
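The sub-block partitioning, previous local model filling (PLMF), and noisy over-the-air aggregation described in the abstract can be sketched as a toy simulation (device counts, dimensions, and the unit-gain channel assumption are illustrative, not the paper's exact system model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 3 devices, model dimension d, and an OFDM
# super-block split into sub-blocks sized to the *shortest* coherence
# interval among devices, as the abstract describes.
d = 12
coherence_syms = [12, 6, 3]        # coherent symbols per device
sub_block = min(coherence_syms)    # align sub-blocks to the worst link
n_sub_blocks = d // sub_block

local_models = [rng.normal(size=d) for _ in range(3)]
prev_models  = [rng.normal(size=d) for _ in range(3)]

# PLMF: where a dynamic device failed to receive part of the global
# model, it substitutes entries from its previous local update.
received = [np.ones(d, bool), np.ones(d, bool),
            rng.random(d) > 0.3]   # device 3 misses ~30% of entries
uploads = [np.where(m, w, p)
           for w, p, m in zip(local_models, prev_models, received)]

# Uplink OTA aggregation: devices transmit analog symbols sub-block by
# sub-block; the channel sums them and the server sees receiver noise
# (channels assumed perfectly equalized to unit gain for simplicity).
noise_std = 0.01
global_model = np.zeros(d)
for b in range(n_sub_blocks):
    sl = slice(b * sub_block, (b + 1) * sub_block)
    superposed = sum(u[sl] for u in uploads)          # over-the-air sum
    global_model[sl] = (superposed
                        + noise_std * rng.normal(size=sub_block)) / len(uploads)
```

With small receiver noise, `global_model` is close to the plain average of the (PLMF-filled) uploads; the paper's convergence analysis quantifies how the residual aggregation noise and imperfect CSI affect learning.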
Problem

Research questions and friction points this paper is trying to address.

coherence time
heterogeneous links
over-the-air aggregation
distributed learning
channel state information
Innovation

Methods, ideas, or system contributions that make the work stand out.

coherence-aware
over-the-air aggregation
product superposition
federated learning
heterogeneous wireless links
Mehdi Karbalayghareh
Department of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47907, USA
David J. Love
Department of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47907, USA
Christopher G. Brinton
Elmore Associate Professor of ECE, Purdue University
Networking · Machine Learning · Communications · Edge Computing · NextG Wireless