Rethinking LoRA for Data Heterogeneous Federated Learning: Subspace and State Alignment

📅 2026-02-02
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
In non-IID federated learning, Low-Rank Adaptation (LoRA) suffers significant performance degradation due to mismatches between client update subspaces and the server aggregation space, as well as desynchronization of optimizer states. This work proposes FedGaLore, a novel framework that, for the first time, identifies and decouples these two sources of mismatch. It aligns gradient subspaces through spectral-analysis-driven shared-signal extraction and introduces a projected second-moment state synchronization mechanism. By integrating GaLore-style gradient optimization with robust state synchronization, FedGaLore substantially improves the accuracy and robustness of LoRA under data heterogeneity, outperforming existing state-of-the-art methods across benchmarks in natural language understanding, vision, and natural language generation.

๐Ÿ“ Abstract
Low-Rank Adaptation (LoRA) is widely used for federated fine-tuning, yet under non-IID settings it can substantially underperform full-parameter fine-tuning. Through a with-high-probability robustness analysis, we show that this gap can be attributed to two coupled mismatches: (i) update-space mismatch, where clients optimize in a low-rank subspace but aggregation occurs in the full space; and (ii) optimizer-state mismatch, where unsynchronized adaptive states amplify drift across rounds. To address both, we propose FedGaLore, which combines client-side GaLore-style gradient-subspace optimization with server-side drift-robust synchronization of projected second-moment states via spectral shared-signal extraction. Across NLU, vision, and NLG benchmarks, FedGaLore improves robustness and accuracy over state-of-the-art federated LoRA baselines in non-IID settings.
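The client-side idea the abstract names, GaLore-style gradient-subspace optimization, can be sketched as follows: project the full gradient onto a rank-r basis taken from its top singular vectors, keep the Adam moments in that small subspace, and project the update back before applying it. This is a minimal illustration, not the paper's implementation; the function name and the choice to refresh the basis from the current gradient are assumptions (GaLore itself refreshes the basis only periodically).

```python
import numpy as np

def galore_step(W, grad, m, v, P=None, r=4, lr=1e-3,
                beta1=0.9, beta2=0.999, eps=1e-8, t=1):
    """One GaLore-style update for a d x k weight matrix W.

    Adam's first/second moments (m, v) live in an r x k subspace
    rather than the full d x k space. Illustrative sketch only.
    """
    if P is None:
        # Rank-r basis from the top-r left singular vectors of the gradient.
        U, _, _ = np.linalg.svd(grad, full_matrices=False)
        P = U[:, :r]                          # d x r projection basis
    g_low = P.T @ grad                        # project gradient: r x k
    m = beta1 * m + (1 - beta1) * g_low       # moments updated in the subspace
    v = beta2 * v + (1 - beta2) * g_low ** 2
    m_hat = m / (1 - beta1 ** t)              # standard Adam bias correction
    v_hat = v / (1 - beta2 ** t)
    update = P @ (m_hat / (np.sqrt(v_hat) + eps))  # lift back to full space
    return W - lr * update, m, v, P
```

In a federated round, each client would run several such steps locally; the paper's point is that the per-client bases P then differ across clients, which is where the alignment problem arises.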
Problem

Research questions and friction points this paper is trying to address.

Federated Learning
LoRA
non-IID
Update-space Mismatch
Optimizer-state Mismatch
Innovation

Methods, ideas, or system contributions that make the work stand out.

LoRA
Federated Learning
non-IID
Subspace Optimization
Optimizer State Synchronization
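The two server-side contributions listed above, spectral shared-signal extraction and projected second-moment synchronization, can be combined in one hedged sketch: stack the client updates, take the top singular directions as a shared basis, then lift each client's subspace-local second moment back to the full space, average, and re-express it in the shared basis. Function and variable names are illustrative, not the paper's API, and the linear lifting of second moments is an approximation (they are squared quantities).

```python
import numpy as np

def sync_states(client_updates, client_Ps, client_vs, r=2):
    """Server-side sketch: shared-basis extraction + state synchronization.

    client_updates: list of d x k update matrices from clients
    client_Ps:      list of d x r_i per-client projection bases
    client_vs:      list of r_i x k projected second-moment states
    """
    # Spectral shared-signal extraction: top-r singular directions
    # of the stacked client updates form the shared subspace.
    stacked = np.hstack(client_updates)               # d x (k * n_clients)
    U, _, _ = np.linalg.svd(stacked, full_matrices=False)
    P_shared = U[:, :r]                               # shared d x r basis

    # Lift each client's second moment to the full space, average,
    # and re-express in the shared basis (approximate, see lead-in).
    lifted = [P @ v for P, v in zip(client_Ps, client_vs)]  # each d x k
    v_sync = P_shared.T @ np.mean(lifted, axis=0)           # r x k
    return P_shared, v_sync
```

The returned pair would be broadcast back so that all clients start the next round optimizing in the same subspace with a consistent adaptive state.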