Guaranteeing Privacy in Hybrid Quantum Learning through Theoretical Mechanisms

πŸ“… 2026-02-02
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This work addresses the challenge of reconciling rigorous privacy guarantees with model utility in quantum machine learning. It proposes HYPER-Q, a novel mechanism that systematically integrates intrinsic quantum noise with classical differential privacy to provide theoretically provable privacy protection for hybrid quantum learning models under the (Ξ΅, Ξ΄)-differential privacy framework. By co-designing a hybrid noise injection strategy, HYPER-Q establishes theoretical bounds on the privacy-utility trade-off, substantially mitigating the performance degradation typically induced by privacy-preserving mechanisms. Empirical evaluations on multiple real-world datasets demonstrate that HYPER-Q outperforms purely classical noise-based approaches, achieving both enhanced adversarial robustness and a superior balance between privacy preservation and model utility.

πŸ“ Abstract
Quantum Machine Learning (QML) is becoming increasingly prevalent due to its potential to enhance classical machine learning (ML) tasks, such as classification. Although quantum noise is often viewed as a major challenge in quantum computing, it also offers a unique opportunity to enhance privacy. In particular, intrinsic quantum noise provides a natural stochastic resource that, when rigorously analyzed within the differential privacy (DP) framework and composed with classical mechanisms, can satisfy formal $(\varepsilon, \delta)$-DP guarantees. This enables a reduction in the required classical perturbation without compromising the privacy budget, potentially improving model utility. However, the integration of classical and quantum noise for privacy preservation remains unexplored. In this work, we propose a hybrid noise-added mechanism, HYPER-Q, that combines classical and quantum noise to protect the privacy of QML models. We provide a comprehensive analysis of its privacy guarantees and establish theoretical bounds on its utility. Empirically, we demonstrate that HYPER-Q outperforms existing classical noise-based mechanisms in terms of adversarial robustness across multiple real-world datasets.
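The paper does not include an implementation, but the core idea of the abstract, letting intrinsic quantum noise absorb part of the privacy budget so that less classical noise must be injected, can be sketched under a simplifying assumption. The snippet below assumes the intrinsic quantum noise is well approximated as additive Gaussian with a known standard deviation `sigma_q` (a modeling assumption, not a claim from the paper), and uses the standard analytic Gaussian-mechanism bound to size the remaining classical perturbation:

```python
import numpy as np

def gaussian_sigma(sensitivity, epsilon, delta):
    # Classical analytic Gaussian mechanism bound:
    # sigma >= sensitivity * sqrt(2 ln(1.25/delta)) / epsilon
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

def hybrid_noise(values, sensitivity, epsilon, delta, sigma_q):
    # Total Gaussian std required for an (epsilon, delta)-DP release.
    sigma_total = gaussian_sigma(sensitivity, epsilon, delta)
    # Assumption: intrinsic quantum noise ~ N(0, sigma_q^2) already
    # contributes part of that variance, so only the remainder is
    # added classically (variances of independent Gaussians add).
    sigma_c = np.sqrt(max(sigma_total**2 - sigma_q**2, 0.0))
    rng = np.random.default_rng(0)
    noisy = values + rng.normal(0.0, sigma_c, size=np.shape(values))
    return noisy, sigma_c
```

With `sigma_q = 0` this reduces to the purely classical Gaussian mechanism; any nonzero quantum contribution shrinks the classical noise scale `sigma_c`, which is the utility gain the abstract refers to. HYPER-Q's actual privacy accounting for quantum channels is more involved than this additive-Gaussian approximation.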
Problem

Research questions and friction points this paper is trying to address.

Quantum Machine Learning
Differential Privacy
Quantum Noise
Hybrid Privacy Mechanism
Privacy Preservation
Innovation

Methods, ideas, or system contributions that make the work stand out.

hybrid noise
quantum machine learning
differential privacy
quantum noise
adversarial robustness
πŸ”Ž Similar Papers
No similar papers found.
Hoang M. Ngo
Department of Computer & Information Science & Engineering, University of Florida, Gainesville, Florida, USA
Tre’ R. Jeter
Department of Computer & Information Science & Engineering, University of Florida, Gainesville, Florida, USA
Incheol Shin
Professor of Life Science, Hanyang University
cell biology
Wanli Xing
Department of Education, University of Florida, Gainesville, Florida, USA
T.M. Kahveci
Department of Computer & Information Science & Engineering, University of Florida, Gainesville, Florida, USA
My T. Thai
Professor, University of Florida, IEEE Fellow
Explainable AI, Security and Privacy, Network Science, Optimization