DP-FEDSOFIM: Differentially Private Federated Stochastic Optimization using Regularized Fisher Information Matrix

📅 2026-01-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a central challenge in differentially private federated learning: stringent privacy budgets require large injected noise, which severely slows the convergence of first-order methods, while existing second-order approaches are hindered by prohibitive memory costs in high-dimensional models. To overcome this, the authors propose a server-side second-order optimization framework that constructs a natural gradient preconditioner from the Fisher information matrix and leverages the Sherman-Morrison formula for efficient matrix inversion, requiring only O(d) memory and computation per client. The approach is, per the authors, the first to enable scalable second-order optimization under (ε, δ)-differential privacy, balancing privacy guarantees with convergence efficiency. Experiments on CIFAR-10 show that the method consistently achieves higher test accuracy than first-order baselines across a range of privacy budgets.

📝 Abstract
Differentially private federated learning (DP-FL) suffers from slow convergence under tight privacy budgets due to the overwhelming noise introduced to preserve privacy. While adaptive optimizers can accelerate convergence, existing second-order methods such as DP-FedNew require O(d²) memory at each client to maintain local feature covariance matrices, making them impractical for high-dimensional models. We propose DP-FedSOFIM, a server-side second-order optimization framework that leverages the Fisher Information Matrix (FIM) as a natural gradient preconditioner while requiring only O(d) memory per client. By employing the Sherman-Morrison formula for efficient matrix inversion, DP-FedSOFIM achieves O(d) computational complexity per round while maintaining the convergence benefits of second-order methods. Our analysis proves that the server-side preconditioning preserves (ε, δ)-differential privacy through the post-processing theorem. Empirical evaluation on CIFAR-10 demonstrates that DP-FedSOFIM achieves superior test accuracy compared to first-order baselines across multiple privacy regimes.
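The abstract's O(d) claim hinges on the Sherman-Morrison identity: for a rank-1 update, (A + uvᵀ)⁻¹ = A⁻¹ − A⁻¹uvᵀA⁻¹ / (1 + vᵀA⁻¹u). With a diagonal damping term A = λI and a rank-1 FIM approximation ggᵀ built from the aggregated gradient, the preconditioned direction (λI + ggᵀ)⁻¹g collapses to g / (λ + gᵀg), so no d×d matrix is ever formed. The sketch below illustrates this simplification; the function name, the damping parameter λ, and the rank-1 approximation are illustrative assumptions, not the paper's exact regularization scheme.

```python
import numpy as np

def natural_gradient_direction(g, lam=1.0):
    """Compute (lam*I + g g^T)^{-1} g in O(d) via Sherman-Morrison.

    Sherman-Morrison with A = lam*I and u = v = g gives
        (lam*I + g g^T)^{-1} g = g / (lam + g^T g),
    so the d x d preconditioner is never materialized.
    """
    return g / (lam + g @ g)

# Cross-check against explicit inversion on a small problem.
rng = np.random.default_rng(0)
g = rng.standard_normal(5)
explicit = np.linalg.solve(0.1 * np.eye(5) + np.outer(g, g), g)
assert np.allclose(explicit, natural_gradient_direction(g, lam=0.1))

# The closed form scales to high dimension with O(d) memory.
g_big = rng.standard_normal(1_000_000)
direction = natural_gradient_direction(g_big, lam=0.1)
```

Because the server applies this preconditioner only after privatized gradients are aggregated, the post-processing theorem preserves the (ε, δ)-DP guarantee, which is why the expensive step can live server-side.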
Problem

Research questions and friction points this paper is trying to address.

differentially private federated learning
slow convergence
high-dimensional models
memory complexity
privacy-preserving optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Differentially Private Federated Learning
Fisher Information Matrix
Second-order Optimization
Sherman-Morrison Formula
Memory Efficiency
Sidhant R. Nair
Department of Mechanical Engineering, Indian Institute of Technology Delhi, New Delhi, India
Tanmay Sen
SQC & OR Unit, Indian Statistical Institute Kolkata, Kolkata, India
Mrinmay Sen
Joint PhD, Dept. of AI, IIT Hyderabad and Dept. Computing Technologies, SUT Melbourne
Optimisation in machine learning and deep learning · Federated learning · Computer Vision