ReBoot: Encrypted Training of Deep Neural Networks with CKKS Bootstrapping

📅 2025-06-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational overhead, severe noise accumulation, and poor scalability that hinder fully homomorphic encrypted training of deep neural networks (DNNs), this paper proposes ReBoot, the first non-interactive, end-to-end homomorphic encryption (HE)-enabled DNN training framework. Methodologically, it introduces an HE-compatible network architecture built on localized error signals—eliminating global gradient propagation—and integrates a customized CKKS approximate-bootstrapping mechanism with SIMD-based real-number packing and arithmetic optimizations to suppress noise growth and reduce multiplicative depth. Evaluated on image-recognition and tabular datasets, the framework achieves accuracy comparable to 32-bit floating-point plaintext training: it outperforms encrypted logistic regression by up to 3.27% in accuracy and prior HE-DNN approaches by up to 6.83%, while reducing training latency by up to 8.83×. This work establishes a scalable, high-fidelity encrypted learning paradigm for privacy-sensitive machine-learning-as-a-service (MLaaS) deployments.
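The "localized error signals" idea can be illustrated with a plaintext sketch: each layer owns a fixed ±1 readout that turns its activations into a local scalar error, and updates its weights from that error alone, so no gradient ever crosses a layer boundary. All names, shapes, the square activation, and the readout scheme below are our illustrative assumptions—this is not ReBoot's actual HE circuit. The payoff under HE is that each weight update has the multiplicative depth of a single layer rather than of the whole network.

```python
import random

random.seed(0)

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

class LocalLayer:
    def __init__(self, n_in, n_out, lr=0.05):
        self.w = [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
                  for _ in range(n_out)]
        # Fixed +/-1 readout that maps activations to one local prediction.
        self.readout = [1.0 if j % 2 == 0 else -1.0 for j in range(n_out)]
        self.lr = lr

    def local_update(self, x, y):
        """One gradient step on this layer's own local loss (pred - y)^2 / 2."""
        h = [dot(row, x) for row in self.w]   # pre-activations
        # Square activation: HE-friendly, costing a single multiplication.
        pred = dot(self.readout, [hj ** 2 for hj in h])
        err = pred - y                        # local error signal
        for j, row in enumerate(self.w):
            g = 2.0 * err * self.readout[j] * h[j]
            for i in range(len(row)):
                row[i] -= self.lr * g * x[i]
        return [dot(row, x) ** 2 for row in self.w], err

# Two stacked layers, each trained purely from its own local error.
layers = [LocalLayer(2, 4), LocalLayer(4, 4)]
data = [([1.0, 0.0], 1.0), ([0.0, 1.0], -1.0)]
first_layer_err = []
for epoch in range(300):
    total = 0.0
    for x, y in data:
        a, e0 = layers[0].local_update(x, y)  # update layer 0 locally
        layers[1].local_update(a, y)          # layer 1 never sees layer 0's loss
        total += abs(e0)
    first_layer_err.append(total)
# The first layer's local error shrinks even though no gradient was ever
# propagated across a layer boundary.
```

The square activation in the sketch mirrors a common HE design choice: polynomial activations keep multiplicative depth low, which is exactly what local-error training is meant to bound per update.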

📝 Abstract
Growing concerns over data privacy underscore the need for deep learning methods capable of processing sensitive information without compromising confidentiality. Among privacy-enhancing technologies, Homomorphic Encryption (HE) stands out by providing post-quantum cryptographic security and end-to-end data protection, safeguarding data even during computation. While Deep Neural Networks (DNNs) have gained attention in HE settings, their use has largely been restricted to encrypted inference. Prior research on encrypted training has primarily focused on logistic regression or has relied on multi-party computation to enable model fine-tuning. This stems from the substantial computational overhead and algorithmic complexity involved in DNN training under HE. In this paper, we present ReBoot, the first framework to enable fully encrypted and non-interactive training of DNNs. Built upon the CKKS scheme, ReBoot introduces a novel HE-compliant neural network architecture based on local error signals, specifically designed to minimize multiplicative depth and reduce noise accumulation. ReBoot employs a tailored packing strategy that leverages real-number arithmetic via SIMD operations, significantly lowering both computational and memory overhead. Furthermore, by integrating approximate bootstrapping, the ReBoot learning algorithm supports effective training of arbitrarily deep multi-layer perceptrons, making it well-suited for machine learning as a service. ReBoot is evaluated on both image recognition and tabular benchmarks, achieving accuracy comparable to 32-bit floating-point plaintext training while enabling fully encrypted training. It improves test accuracy by up to +3.27% over encrypted logistic regression, and up to +6.83% over existing encrypted DNN frameworks, while reducing training latency by up to 8.83x. ReBoot is made available to the scientific community as a public repository.
Problem

Research questions and friction points this paper is trying to address.

Enabling fully encrypted training of deep neural networks
Reducing computational overhead in homomorphic encryption
Achieving accuracy comparable to plaintext training
Innovation

Methods, ideas, or system contributions that make the work stand out.

CKKS-based encrypted DNN training framework
HE-compliant neural network with local errors
Approximate bootstrapping for deep MLPs
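Why bootstrapping enables "arbitrarily deep" training can be seen with a toy level-counting model: in CKKS, each homomorphic multiplication consumes one level of the ciphertext's noise budget, and bootstrapping refreshes the budget so evaluation can continue. The class and parameter names below are our own, and real CKKS noise behavior depends on scheme parameters—this is a back-of-the-envelope sketch, not a noise analysis.

```python
class ToyCiphertext:
    """Tracks only the remaining multiplicative depth of a ciphertext."""

    def __init__(self, levels):
        self.levels = levels          # remaining multiplicative levels

    def multiply(self):
        if self.levels == 0:
            raise RuntimeError("out of levels: bootstrap first")
        self.levels -= 1              # each multiplication consumes a level

    def bootstrap(self, fresh_levels):
        self.levels = fresh_levels    # refresh the noise budget

def evaluate(depth, start_levels, fresh_levels):
    """Count the bootstraps needed to evaluate a circuit of the given
    multiplicative depth."""
    ct = ToyCiphertext(start_levels)
    bootstraps = 0
    for _ in range(depth):
        if ct.levels == 0:
            ct.bootstrap(fresh_levels)
            bootstraps += 1
        ct.multiply()
    return bootstraps

# A depth-20 circuit starting with 5 levels, with 4 usable levels after
# each refresh, needs ceil((20 - 5) / 4) = 4 bootstraps.
print(evaluate(20, 5, 4))  # -> 4
```

This is why minimizing multiplicative depth (the local-error architecture) and refreshing noise (bootstrapping) are complementary: lower depth per update means fewer, cheaper refreshes, while bootstrapping removes the hard cap on total depth.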