Understanding the Resource Cost of Fully Homomorphic Encryption in Quantum Federated Learning

📅 2026-03-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the privacy risks posed by gradient leakage in quantum federated learning and investigates the practical overhead of fully homomorphic encryption (FHE) as a mitigation strategy. For the first time, we integrate the CKKS FHE scheme into a federated learning framework to protect the parameters of quantum machine learning models, specifically quantum convolutional neural networks (QCNNs), and apply this approach to brain tumor classification on MRI images. Through systematic experiments, we evaluate the computational, memory, and communication overheads introduced by FHE, revealing a critical trade-off between privacy preservation and model performance: reducing the number of model parameters alleviates overhead but degrades classification accuracy, highlighting the inherent tension among privacy, efficiency, and model complexity.

📝 Abstract
Quantum Federated Learning (QFL) enables distributed training of Quantum Machine Learning (QML) models by sharing model gradients instead of raw data. However, these gradients can still expose sensitive user information. To enhance privacy, homomorphic encryption of parameters has been proposed as a solution in QFL and related frameworks. In this work, we evaluate the overhead introduced by Fully Homomorphic Encryption (FHE) in QFL setups and assess its feasibility for real-world applications. We implement several QML models, including a Quantum Convolutional Neural Network (QCNN), trained in a federated environment with parameters encrypted using the CKKS scheme; this marks the first QCNN trained in a federated setting with CKKS-encrypted parameters. Models of varying architectures were trained to predict brain tumors from MRI scans. The experiments reveal that memory and communication overhead remain substantial, making FHE challenging to deploy. Minimizing this overhead requires reducing the number of model parameters, which in turn degrades classification performance, introducing a trade-off between privacy and model complexity.
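In the setup described above, each client encrypts its model parameters before sending them to the server, which aggregates ciphertexts without ever decrypting them. The paper uses CKKS, an approximate FHE scheme over real-valued vectors; as a self-contained illustration of the underlying idea only, the sketch below uses a toy Paillier-style additively homomorphic scheme with deliberately tiny keys. Every name and parameter here is illustrative and not the paper's implementation.

```python
import random
from math import gcd

# Toy Paillier cryptosystem: additively homomorphic, standing in for CKKS.
# WARNING: demo-sized primes for readability; real keys are >= 2048 bits.
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
g = n + 1                      # standard simplification for the generator
lam = (p - 1) * (q - 1)        # phi(n) works in place of lcm(p-1, q-1) here
mu = pow(lam, -1, n)           # decryption constant

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    x = pow(c, lam, n2)
    return (((x - 1) // n) * mu) % n

# One federated round: clients encrypt scaled-integer weights, the server
# multiplies ciphertexts (= adds plaintexts) without seeing any weight.
SCALE = 1000
client_weights = [0.123, 0.456, 0.789]
cts = [encrypt(round(w * SCALE)) for w in client_weights]

agg = 1
for c in cts:
    agg = (agg * c) % n2       # homomorphic addition of client updates

avg = decrypt(agg) / (SCALE * len(cts))
print(avg)                     # 0.456 -- the federated average
```

With an actual CKKS implementation (e.g. via a library such as OpenFHE or TenSEAL), ciphertexts are orders of magnitude larger than the plaintext parameters they protect, which is precisely the memory and communication overhead the paper measures.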
Problem

Research questions and friction points this paper is trying to address.

Fully Homomorphic Encryption
Quantum Federated Learning
Resource Overhead
Privacy-Utility Trade-off
CKKS
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fully Homomorphic Encryption
Quantum Federated Learning
CKKS
Quantum Convolutional Neural Network
Privacy-Utility Trade-off
Lukas Böhm
Dept. of Computer Science, Leipzig University; Center for Scalable Data Analytics and Artificial Intelligence (ScaDS.AI) Dresden/Leipzig, Leipzig University
Arjhun Swaminathan
Medical Data Privacy and Privacy-preserving Machine Learning (MDPPML), University of Tübingen; Institute for Bioinformatics and Medical Informatics (IBMI), University of Tübingen
Anika Hannemann
Dept. of Computer Science, Leipzig University; School of Engineering, Zurich University of Applied Sciences; Center for Scalable Data Analytics and Artificial Intelligence (ScaDS.AI) Dresden/Leipzig, Leipzig University
Erik Buchmann
Leipzig University / ScaDS.AI
Privacy
Security
Machine Learning
Artificial Intelligence
Smart Home Technologies