🤖 AI Summary
This work addresses the privacy risks posed by gradient leakage in quantum federated learning and quantifies the practical overhead of fully homomorphic encryption (FHE) as a mitigation strategy. For the first time, we integrate the CKKS FHE scheme into a federated learning framework to protect the parameters of quantum machine learning models, specifically quantum convolutional neural networks (QCNNs), and apply this approach to brain tumor classification from MRI images. Through systematic experiments, we evaluate the computational, memory, and communication overheads introduced by FHE, revealing a critical trade-off between privacy preservation and model performance: reducing the number of model parameters alleviates the overhead but degrades classification accuracy, highlighting the inherent tension among privacy, efficiency, and model complexity.
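The encrypted-aggregation round the summary describes can be sketched end to end. The snippet below is a pedagogical stand-in, not real CKKS: "encryption" is modeled as adding a secret mask shared by the clients (playing the role of the shared CKKS secret key), which preserves the additive homomorphism the server relies on while omitting all the lattice machinery — and hence the overhead — of the actual scheme. All names and sizes here are illustrative assumptions, not taken from the paper.

```python
import random

random.seed(0)
DIM = 4          # illustrative model size; real QCNNs have more parameters
N_CLIENTS = 3

def keygen(dim):
    # Shared secret mask held by all clients; it mimics the role of the
    # CKKS secret key (this toy scheme is NOT secure, it only models
    # the data flow of homomorphic aggregation).
    return [random.uniform(-1e3, 1e3) for _ in range(dim)]

def encrypt(key, params):
    # "Ciphertext" = parameters plus the secret mask.
    return [p + k for p, k in zip(params, key)]

def homomorphic_sum(ciphertexts):
    # Server-side aggregation: add ciphertexts coordinate-wise without
    # ever seeing any client's plaintext parameters.
    return [sum(col) for col in zip(*ciphertexts)]

def decrypt(key, ct, n_added):
    # Summing n ciphertexts also summed n copies of the mask.
    return [c - n_added * k for c, k in zip(ct, key)]

key = keygen(DIM)
client_params = [[random.uniform(-1, 1) for _ in range(DIM)]
                 for _ in range(N_CLIENTS)]

encrypted = [encrypt(key, p) for p in client_params]
agg = homomorphic_sum(encrypted)                  # performed by the server
avg = [x / N_CLIENTS for x in decrypt(key, agg, N_CLIENTS)]

plain_avg = [sum(col) / N_CLIENTS for col in zip(*client_params)]
assert all(abs(a - b) < 1e-9 for a, b in zip(avg, plain_avg))
```

With real CKKS, `encrypt`/`decrypt` operate on packed vectors of approximate reals and each ciphertext is orders of magnitude larger than the plaintext parameters, which is precisely the memory and communication overhead the work measures.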
📝 Abstract
Quantum Federated Learning (QFL) enables distributed training of Quantum Machine Learning (QML) models by sharing model gradients instead of raw data. However, these gradients can still expose sensitive user information. To enhance privacy, homomorphic encryption of model parameters has been proposed for QFL and related frameworks. In this work, we evaluate the overhead introduced by Fully Homomorphic Encryption (FHE) in QFL setups and assess its feasibility for real-world applications. We implement several QML models, including a Quantum Convolutional Neural Network (QCNN), trained in a federated environment with parameters encrypted under the CKKS scheme. This work marks the first QCNN trained in a federated setting with CKKS-encrypted parameters. Models of varying architectures are trained to classify brain tumors from MRI scans. The experiments reveal that the memory and communication overheads remain substantial, making FHE challenging to deploy in practice. Minimizing this overhead requires reducing the number of model parameters, which in turn degrades classification performance, exposing a trade-off between privacy and model complexity.
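A rough back-of-envelope makes the communication overhead concrete. A fresh CKKS ciphertext consists of two degree-N polynomials with coefficients modulo Q, so it occupies roughly 2·N·log2(Q) bits; the parameter choices below are illustrative assumptions, not the ones used in the paper, and serialization overhead is ignored.

```python
def ckks_ciphertext_bytes(poly_modulus_degree, coeff_mod_bit_sizes):
    # Fresh ciphertext: two polynomials, each with N coefficients mod Q,
    # where log2(Q) is the sum of the coefficient-modulus bit sizes.
    return 2 * poly_modulus_degree * sum(coeff_mod_bit_sizes) // 8

N = 8192                      # illustrative polynomial modulus degree
chain = [60, 40, 40, 60]      # illustrative coefficient-modulus chain
ct_bytes = ckks_ciphertext_bytes(N, chain)

slots = N // 2                # CKKS packs up to N/2 real values per ciphertext
plaintext_bytes = slots * 4   # the same values sent as plain 32-bit floats

print(ct_bytes)                    # 409600 bytes, i.e. ~400 KiB per ciphertext
print(ct_bytes / plaintext_bytes)  # 25.0x blow-up even at full slot packing
```

Models with few parameters cannot fill the available slots, so their effective blow-up is far worse than this best case, which is why shrinking the model only partially relieves the communication cost.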