🤖 AI Summary
To address a key bottleneck in privacy-preserving computation, namely the low training efficiency and slow convergence of logistic regression on encrypted data, this paper proposes an efficient and secure training framework tailored to fully homomorphic encryption (FHE). Methodologically, it introduces a quadratic gradient variant that integrates first-order optimization with second-order curvature information, designs an FHE-adapted enhanced Nesterov Accelerated Gradient (NAG) algorithm, and extends the same idea to the Adagrad and Adam optimizers. Experiments across multiple standard benchmark datasets demonstrate that the proposed method converges to comparable accuracy within only four iterations, reducing the iteration count by over 70% relative to state-of-the-art FHE-based logistic regression approaches. This yields substantial improvements in training speed and practicality under strict privacy guarantees.
📝 Abstract
Training logistic regression over encrypted data has for several years been a compelling approach to addressing security concerns. In this paper, we introduce an efficient gradient variant, called the $quadratic$ $gradient$, for privacy-preserving logistic regression training. We enhance Nesterov's Accelerated Gradient (NAG), the Adaptive Gradient Algorithm (Adagrad), and Adam by incorporating quadratic gradients, and evaluate the improved algorithms on various datasets. Experimental results demonstrate that the enhanced algorithms converge significantly faster than traditional first-order gradient methods. Moreover, we apply the enhanced NAG method to implement homomorphic logistic regression training, achieving comparable results within just 4 iterations. The quadratic gradient approach has the potential to integrate first-order gradient descent/ascent algorithms with second-order Newton-Raphson methods, and it could be applied to a wide range of numerical optimization problems.
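As a rough plaintext illustration of the idea, the gradient of the logistic log-likelihood can be preconditioned by a fixed diagonal bound on the Hessian and fed into an NAG-style update. This is a minimal sketch under stated assumptions: the names `quadratic_gradient` and `nag_quadratic`, and the specific diagonal bound built from the fixed Hessian $X^\top X/4$, are choices of this sketch rather than the paper's implementation, and the encrypted-domain details (e.g. polynomial approximation of the sigmoid) are omitted.

```python
import numpy as np

# Illustrative sketch only: function names and the particular diagonal
# Hessian bound below are assumptions, not the paper's exact method.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quadratic_gradient(X, y, w, eps=1e-8):
    """Preconditioned ("quadratic") gradient of the logistic log-likelihood.

    The ordinary gradient g is scaled element-wise by 1 / B_ii, where
    B_ii = eps + sum_j |(X^T X / 4)_ij|.  Since |sigmoid'(z)| <= 1/4,
    X^T X / 4 bounds the true Hessian, and the row-sum diagonal B
    dominates that bound, so a unit step along g / B stays stable and
    behaves like a damped Newton step.
    """
    g = X.T @ (y - sigmoid(X @ w))                   # first-order ascent direction
    B = eps + np.abs(X.T @ X / 4.0).sum(axis=1)      # diagonal Hessian bound
    return g / B

def nag_quadratic(X, y, iters=10):
    """Nesterov-style accelerated ascent driven by the quadratic gradient."""
    w = np.zeros(X.shape[1])
    v = w.copy()
    for t in range(iters):
        w_next = v + quadratic_gradient(X, y, v)     # unit learning rate
        v = w_next + (t / (t + 3.0)) * (w_next - w)  # standard NAG momentum
        w = w_next
    return w
```

On small synthetic data this kind of update typically reaches its final accuracy within a handful of iterations, consistent with the fast-convergence claim; over encrypted data the same arithmetic would be evaluated homomorphically with a polynomial sigmoid approximation.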