🤖 AI Summary
Quantum machine learning (QML) poses privacy risks due to memorization of sensitive training data, yet existing quantum differential privacy (QDP) mechanisms lack empirical auditing tools. To address this gap, we propose the first black-box privacy auditing framework tailored for QML. Our method leverages quantum canary encoding and trace-distance bound analysis to establish a rigorous mathematical relationship between canary offset magnitude and privacy budget consumption, thereby deriving a measurable lower bound on privacy leakage. We further design a cross-platform (quantum simulator and superconducting hardware) black-box query protocol enabling end-to-end empirical validation. Evaluated across multiple QML models, our framework accurately quantifies actual privacy leakage during training; experimental results closely match theoretical bounds, effectively bridging the gap between formal privacy guarantees and empirical assessment.
📝 Abstract
Quantum machine learning (QML) promises significant computational advantages, yet models trained on sensitive data risk memorizing individual records, creating serious privacy vulnerabilities. While Quantum Differential Privacy (QDP) mechanisms provide theoretical worst-case guarantees, they critically lack empirical verification tools for deployed models. We introduce the first black-box privacy auditing framework for QML, based on Lifted Quantum Differential Privacy, which leverages quantum canaries (strategically offset-encoded quantum states) to detect memorization and precisely quantify privacy leakage during training. Our framework establishes a rigorous mathematical connection between canary offset and trace-distance bounds, deriving empirical lower bounds on privacy budget consumption that bridge the critical gap between theoretical guarantees and practical privacy verification. Comprehensive evaluations on both quantum simulators and physical quantum hardware demonstrate our framework's effectiveness in measuring actual privacy loss in QML models, enabling robust privacy verification in QML systems.
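To illustrate the kind of relationship the abstract describes, the sketch below shows how a canary offset translates into a trace distance between quantum states, and how black-box output frequencies can yield an empirical lower bound on the privacy budget. This is a minimal toy example, not the paper's actual auditing protocol: the single-qubit canary encoding, the offset angle `delta`, and the measurement frequencies `p_with`/`p_without` are all illustrative assumptions.

```python
import numpy as np

def qubit_state(theta):
    # Pure qubit state cos(theta/2)|0> + sin(theta/2)|1>
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def density(psi):
    # Density matrix |psi><psi| of a pure state
    return np.outer(psi, psi.conj())

def trace_distance(rho, sigma):
    # T(rho, sigma) = 0.5 * ||rho - sigma||_1
    # (half the sum of absolute eigenvalues of the Hermitian difference)
    eigvals = np.linalg.eigvalsh(rho - sigma)
    return 0.5 * np.sum(np.abs(eigvals))

def empirical_epsilon_lower_bound(p_with, p_without):
    # If the training mechanism M is eps-QDP, then for any output event S:
    #     Pr[M(rho) in S] <= exp(eps) * Pr[M(sigma) in S]
    # so any observed probability ratio gives a lower bound on eps.
    return np.log(p_with / p_without)

# Canary encoded as a state offset by angle delta from the baseline.
delta = np.pi / 2  # illustrative offset
rho = density(qubit_state(0.0))
sigma = density(qubit_state(delta))
tau = trace_distance(rho, sigma)  # equals sin(delta/2) for pure qubit states

# Hypothetical black-box query frequencies for a distinguishing event,
# with and without the canary in the training set.
eps_hat = empirical_epsilon_lower_bound(0.9, 0.4)
```

Here `tau` grows monotonically with the offset `delta`, which is the intuition behind tying canary offset magnitude to the trace-distance bound, while `eps_hat` is the generic differential-privacy auditing estimator applied to black-box query statistics.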