Hierarchy of discriminative power and complexity in learning quantum ensembles

📅 2026-01-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the lack of a distance measure between quantum state ensembles that simultaneously offers strong discriminative power and statistical efficiency. The authors propose the MMD-\(k\) hierarchy, which extends maximum mean discrepancy to quantum ensembles, and establish, for the first time, a rigorous trade-off between discriminative capability and sample complexity as a function of the moment order \(k\). Theoretical analysis shows that for ensembles of \(N\) pure states, MMD-\(k\) requires \(\Theta(N^{2-2/k})\) samples for fixed \(k\), while achieving full discriminative power (at \(k = N\)) demands \(\Theta(N^3)\) samples. In contrast, the quantum Wasserstein distance attains full discriminative power with only \(\Theta(N^2 \log N)\) samples. An MMD-\(k\) estimator based on the SWAP test is also presented, offering practical guidance for designing loss functions in quantum machine learning.

📝 Abstract
Distance metrics are central to machine learning, yet distances between ensembles of quantum states remain poorly understood due to fundamental quantum measurement constraints. We introduce a hierarchy of integral probability metrics, termed MMD-$k$, which generalizes the maximum mean discrepancy to quantum ensembles and exhibits a strict trade-off between discriminative power and statistical efficiency as the moment order $k$ increases. For pure-state ensembles of size $N$, estimating MMD-$k$ using experimentally feasible SWAP-test-based estimators requires $\Theta(N^{2-2/k})$ samples for constant $k$, and $\Theta(N^3)$ samples to achieve full discriminative power at $k = N$. In contrast, the quantum Wasserstein distance attains full discriminative power with $\Theta(N^2 \log N)$ samples. These results provide principled guidance for the design of loss functions in quantum machine learning, which we illustrate in the training of quantum denoising diffusion probabilistic models.
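The SWAP-test-based estimation described in the abstract can be sketched in a classical simulation. The sketch below assumes the MMD-$k$ kernel is the $k$-th power of the squared state overlap, $\kappa(\psi,\phi)=|\langle\psi|\phi\rangle|^{2k}$, estimated via simulated SWAP tests (accept probability $(1+|\langle\psi|\phi\rangle|^2)/2$); this is our reading of the abstract, not the paper's exact construction, and all helper names are hypothetical:

```python
import numpy as np

def random_pure_states(n, dim, rng):
    """n random pure states as complex unit vectors of dimension dim."""
    v = rng.normal(size=(n, dim)) + 1j * rng.normal(size=(n, dim))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def swap_test_overlap(psi, phi, shots, rng):
    """Simulate SWAP-test estimation of |<psi|phi>|^2.
    The ancilla accepts with probability (1 + |<psi|phi>|^2) / 2."""
    p_accept = 0.5 * (1.0 + abs(np.vdot(psi, phi)) ** 2)
    accepts = rng.random(shots) < p_accept
    return 2.0 * accepts.mean() - 1.0

def mmd_k_squared(ens_a, ens_b, k=2, shots=2000, rng=None):
    """Plug-in estimate of MMD-k^2 between two pure-state ensembles,
    using the assumed kernel kappa(psi, phi) = |<psi|phi>|^(2k)."""
    rng = np.random.default_rng(0) if rng is None else rng
    def mean_kernel(xs, ys):
        return np.mean([swap_test_overlap(x, y, shots, rng) ** k
                        for x in xs for y in ys])
    return (mean_kernel(ens_a, ens_a) + mean_kernel(ens_b, ens_b)
            - 2.0 * mean_kernel(ens_a, ens_b))

rng = np.random.default_rng(7)
A = random_pure_states(6, 4, rng)
B = random_pure_states(6, 4, rng)
print(mmd_k_squared(A, A, k=2, rng=np.random.default_rng(1)))  # near zero up to shot noise
print(mmd_k_squared(A, B, k=2, rng=np.random.default_rng(2)))  # strictly larger
```

Note this is a biased V-statistic (diagonal terms included, and raising a noisy overlap estimate to the $k$-th power adds further bias); on hardware one would instead combine outcomes of independent SWAP tests into an unbiased U-statistic, as the paper's estimator presumably does.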
Problem

Research questions and friction points this paper is trying to address.

quantum ensembles
distance metrics
discriminative power
statistical efficiency
quantum machine learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

quantum ensembles
maximum mean discrepancy
MMD-k
sample complexity
quantum Wasserstein distance
Jian Yao
Wuhan University
Computer Vision · AI · 3D · Robotics · SLAM
Pengtao Li
Department of Mathematics, University of Southern California, Los Angeles, California 90089, USA
Xiaohui Chen
University of Southern California
high-dimensional statistics · machine learning · optimal transport
Quntao Zhuang
Ming Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, California 90089, USA; Department of Physics and Astronomy, University of Southern California, Los Angeles, California 90089, USA