AI Summary
This work addresses the challenges of high computational overhead, numerical instability, and excessive communication costs that plague existing second-order optimization methods in non-IID federated learning. To overcome these limitations, the authors propose FedRCO, a novel framework that integrates efficient curvature approximation, real-time gradient anomaly monitoring, a fail-safe state reset mechanism, and a curvature-preserving adaptive aggregation strategy. This design effectively retains local curvature information while ensuring numerical stability. Experimental results demonstrate that FedRCO consistently outperforms state-of-the-art first- and second-order methods across diverse non-IID settings, achieving higher model accuracy, faster convergence, and reduced communication overhead.
Abstract
In this paper, we present Federated Robust Curvature Optimization (FedRCO), a novel second-order optimization framework designed to improve convergence speed and reduce communication cost in Federated Learning systems under statistical heterogeneity. Existing second-order optimization methods are often computationally expensive and numerically unstable in distributed settings. In contrast, FedRCO addresses these challenges by integrating an efficient approximate curvature optimizer with a provable stability mechanism. Specifically, FedRCO incorporates three key components: (1) a Gradient Anomaly Monitor that detects and mitigates exploding gradients in real time, (2) a Fail-Safe Resilience protocol that resets optimization states upon numerical instability, and (3) a Curvature-Preserving Adaptive Aggregation strategy that safely integrates global knowledge without erasing the local curvature geometry. Theoretical analysis shows that FedRCO effectively mitigates instability and prevents unbounded updates while preserving optimization efficiency. Extensive experiments show that FedRCO is robust across diverse non-IID scenarios, achieving higher accuracy and faster convergence than state-of-the-art first- and second-order methods.
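To make the monitor-and-reset idea concrete, the sketch below shows one way a client-side Gradient Anomaly Monitor and Fail-Safe reset could interact during a local second-order step. Everything here is an illustrative assumption: the class names, the z-score-style anomaly rule, the identity reset of the curvature estimate, and the step skipping are not the paper's actual algorithm, only a minimal stand-in for the mechanism the abstract describes.

```python
import numpy as np

class GradientAnomalyMonitor:
    """Flags gradient norms that explode relative to recent history.
    Hypothetical rule: norm > mean + z_thresh * std over a sliding window."""
    def __init__(self, window=20, z_thresh=3.0, warmup=5):
        self.norms, self.window = [], window
        self.z_thresh, self.warmup = z_thresh, warmup

    def is_anomalous(self, grad):
        norm = float(np.linalg.norm(grad))
        history = self.norms[-self.window:]
        anomalous = False
        if len(history) >= self.warmup:
            mean, std = np.mean(history), np.std(history)
            anomalous = norm > mean + self.z_thresh * (std + 1e-8)
        self.norms.append(norm)
        return anomalous

class FailSafeOptimizerState:
    """Holds an approximate (inverse) curvature matrix; on instability,
    the fail-safe resets it to the identity (i.e., falls back to SGD-like steps)."""
    def __init__(self, dim):
        self.dim = dim
        self.curvature = np.eye(dim)

    def reset(self):
        self.curvature = np.eye(self.dim)

def local_step(params, grad, state, monitor, lr=0.1):
    """One local update: skip the step and reset curvature if the gradient
    is anomalous; otherwise apply a curvature-preconditioned update."""
    if monitor.is_anomalous(grad):
        state.reset()        # fail-safe: discard possibly corrupted curvature
        return params        # skip the unstable update entirely
    return params - lr * state.curvature @ grad
```

In this toy version, a spiking gradient leaves the parameters untouched for that round and wipes the curvature estimate, which mirrors the abstract's claim of preventing unbounded updates at the cost of temporarily losing second-order information.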