🤖 AI Summary
This work addresses the "rank collapse" phenomenon in heterogeneous federated low-rank adaptation (FedLoRA), where disparities in client resources and data distributions cause the global update's energy to concentrate on the smallest shared rank, degrading performance and increasing sensitivity to rank configuration. The authors are the first to theoretically identify and analyze the root cause of rank collapse, and to mitigate it they propose raFLoRA, a rank-partitioned weighted aggregation mechanism. Specifically, local LoRA updates are decomposed into multiple rank-based partitions, and aggregation weights are dynamically assigned according to the effective contribution of clients within each partition, thereby overcoming the limitations of conventional rank-agnostic aggregation. Experiments demonstrate that raFLoRA significantly outperforms existing FedLoRA methods on both classification and reasoning tasks, effectively alleviating rank collapse and enhancing model performance while maintaining communication efficiency.
📝 Abstract
Federated low-rank adaptation (FedLoRA) has facilitated communication-efficient and privacy-preserving fine-tuning of foundation models for downstream tasks. In practical federated learning scenarios, client heterogeneity in system resources and data distributions motivates heterogeneous LoRA ranks across clients. We identify a previously overlooked phenomenon in heterogeneous FedLoRA, termed rank collapse, where the energy of the global update concentrates on the minimum shared rank, resulting in suboptimal performance and high sensitivity to rank configurations. Through theoretical analysis, we reveal the root cause of rank collapse: a mismatch between rank-agnostic aggregation weights and rank-dependent client contributions, which systematically suppresses higher-rank updates at a geometric rate over rounds. Motivated by this insight, we propose raFLoRA, a rank-partitioned aggregation method that decomposes local updates into rank partitions and then aggregates each partition weighted by its effective client contributions. Extensive experiments across classification and reasoning tasks show that raFLoRA prevents rank collapse, improves model performance, and preserves communication efficiency compared to state-of-the-art FedLoRA baselines.
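To make the aggregation idea concrete, here is a minimal NumPy sketch of rank-partitioned weighted aggregation, under assumptions not taken from the paper: each client i holds a LoRA update B_i A_i (B_i of shape d_out × r_i, A_i of shape r_i × d_in), each rank slot j is the rank-1 component formed by column j of B_i and row j of A_i, and a client's "effective contribution" in a slot is proxied by the Frobenius norm of that component. The function names and the norm-based weighting are illustrative choices, not raFLoRA's actual formulation; the point is only that slot j is averaged over the clients whose rank covers it, with per-slot weights, instead of one rank-agnostic weight per client.

```python
import numpy as np

def rank_partitioned_aggregate(deltas, ranks, r_max):
    """Sketch of rank-partitioned weighted aggregation (hypothetical helper).

    deltas: list of per-client (B_i, A_i) pairs, B_i: (d_out, r_i), A_i: (r_i, d_in)
    ranks:  list of client ranks r_i
    r_max:  largest rank across clients
    Returns the aggregated global update of shape (d_out, d_in).
    """
    d_out = deltas[0][0].shape[0]
    d_in = deltas[0][1].shape[1]
    global_update = np.zeros((d_out, d_in))
    for j in range(r_max):
        # Only clients whose rank covers slot j contribute to this partition,
        # so higher-rank slots are no longer diluted by rank-agnostic weights.
        members = [i for i, r in enumerate(ranks) if r > j]
        if not members:
            continue
        comps, weights = [], []
        for i in members:
            B, A = deltas[i]
            comp = np.outer(B[:, j], A[j, :])   # rank-1 component for slot j
            comps.append(comp)
            weights.append(np.linalg.norm(comp))  # energy as a contribution proxy
        w = np.array(weights)
        total = w.sum()
        w = w / total if total > 0 else np.full(len(members), 1.0 / len(members))
        for wi, ci in zip(w, comps):
            global_update += wi * ci
    return global_update
```

A rank-agnostic baseline would instead average whole updates with one weight per client, so slots beyond the minimum shared rank receive contributions from few clients yet keep the same small weights, shrinking geometrically over rounds; weighting within each slot is what this sketch changes.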