Kernel Trace Distance: Quantum Statistical Metric between Measures through RKHS Density Operators

📅 2025-07-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses key limitations of existing distributional distance measures, exemplified by Maximum Mean Discrepancy (MMD): weak discriminability, high sensitivity to hyperparameters such as the kernel bandwidth, and sample complexity subject to the curse of dimensionality. We propose a novel Integral Probability Metric (IPM) grounded in the Schatten-p norm (p ∈ [1,2)) of covariance operators within a Reproducing Kernel Hilbert Space (RKHS). Theoretically bridging MMD and Wasserstein distances, our metric inherits MMD's computational tractability while gaining Wasserstein's geometric discriminability, thereby enhancing sensitivity to distributional discrepancies. It is robust to kernel hyperparameters and avoids the curse of dimensionality. Methodologically, we employ kernelized density operators with Schatten-p norm regularization to ensure theoretical soundness and numerical stability. Extensive experiments, including approximate Bayesian computation under data contamination and particle-flow simulation, demonstrate superior statistical power, robustness, and practical efficacy.
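The construction above can be made concrete. The following formalization is a sketch inferred from this summary, with φ denoting the feature map of the chosen kernel and ⊗ the outer product (the notation is an assumption, not taken from the paper):

$$
d_p(P, Q) = \big\| \rho_P - \rho_Q \big\|_{S_p},
\qquad
\rho_P = \mathbb{E}_{x \sim P}\big[\varphi(x) \otimes \varphi(x)\big],
$$

$$
\|A\|_{S_p} = \Big( \sum_i \sigma_i(A)^p \Big)^{1/p}, \qquad p \in [1, 2),
$$

where σᵢ(A) are the singular values of A. At p = 1 (the trace norm) this is the quantum trace distance between RKHS density operators, matching the paper's title; smaller p weighs small spectral components more heavily, which is consistent with the improved discriminability claimed above.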

📝 Abstract
Distances between probability distributions are a key component of many statistical machine learning tasks, from two-sample testing to generative modeling, among others. We introduce a novel distance between measures that compares them through a Schatten norm of their kernel covariance operators. We show that this new distance is an integral probability metric that can be framed between a Maximum Mean Discrepancy (MMD) and a Wasserstein distance. In particular, we show that it avoids some pitfalls of MMD, by being more discriminative and robust to the choice of hyperparameters. Moreover, it benefits from some compelling properties of kernel methods, which can avoid the curse of dimensionality for their sample complexity. We provide an algorithm to compute the distance in practice by introducing an extension of the kernel matrix to differences of distributions that could be of independent interest. These advantages are illustrated by robust approximate Bayesian computation under contamination as well as particle flow simulations.
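As an illustration of how such a distance could be estimated from samples, here is a minimal sketch. This is a reconstruction, not the authors' algorithm: it assumes an RBF kernel, uses uncentered empirical second-moment (density) operators, and relies on the fact that K^{1/2} diag(s) K^{1/2} shares its nonzero spectrum with the operator difference; all function names are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, bandwidth=1.0):
    # Squared Euclidean distances between rows of A and B, then Gaussian kernel.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def kernel_schatten_distance(X, Y, p=1.0, bandwidth=1.0):
    """Schatten-p norm of the difference of empirical (uncentered)
    kernel covariance operators of samples X and Y (a sketch)."""
    Z = np.vstack([X, Y])
    K = rbf_kernel(Z, Z, bandwidth)
    n, m = len(X), len(Y)
    # Signed empirical weights: +1/n on X points, -1/m on Y points.
    s = np.concatenate([np.full(n, 1.0 / n), np.full(m, -1.0 / m)])
    # K^{1/2} diag(s) K^{1/2} is symmetric and shares its nonzero
    # eigenvalues with the operator difference, so its spectrum gives
    # the singular values needed for the Schatten-p norm.
    w, U = np.linalg.eigh(K)
    sqrtK = (U * np.sqrt(np.clip(w, 0.0, None))) @ U.T
    eigs = np.linalg.eigvalsh((sqrtK * s) @ sqrtK)
    return float((np.abs(eigs) ** p).sum() ** (1.0 / p))
```

With p = 1 this computes an empirical trace distance between RKHS density operators; the symmetrized form is used here purely for numerical stability, since the non-symmetric matrix diag(s)·K has the same nonzero spectrum but poorly conditioned eigenvalues.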
Problem

Research questions and friction points this paper is trying to address.

Existing distances such as MMD can fail to discriminate between close distributions
MMD is highly sensitive to kernel hyperparameters such as the bandwidth
Geometry-aware alternatives such as the Wasserstein distance suffer from the curse of dimensionality in sample complexity
Innovation

Methods, ideas, or system contributions that make the work stand out.

A Schatten norm that compares kernel covariance operators
An integral probability metric framed between MMD and the Wasserstein distance
An extension of the kernel matrix to differences of distributions