Accelerated optimization of measured relative entropies

📅 2025-11-22
🤖 AI Summary
This work addresses the efficient computation of the measured relative entropy and the measured Rényi relative entropy between quantum states ρ and σ. Leveraging their variational definitions, the authors establish that the associated objective functions are β-smooth and γ-strongly convex/concave, with condition number governed by the max-relative entropies of ρ and σ. Building on these properties, they propose a Nesterov-accelerated projected gradient method, combining matrix calculus with the variational characterizations for robust numerical evaluation. Compared with prior semidefinite-programming approaches, the algorithm is substantially more memory efficient and achieves significant speedups for well-conditioned states. It enables high-precision, large-scale computation of measured relative entropies, providing a scalable computational tool for quantum hypothesis testing and quantum information theory.

📝 Abstract
The measured relative entropy and measured Rényi relative entropy are quantifiers of the distinguishability of two quantum states $ρ$ and $σ$. They are defined as the maximum classical relative entropy or Rényi relative entropy realizable by performing a measurement on $ρ$ and $σ$, and they have interpretations in terms of asymptotic quantum hypothesis testing. Crucially, they can be rewritten in terms of variational formulas involving the optimization of a concave or convex objective function over the set of positive definite operators. In this paper, we establish foundational properties of these objective functions by analyzing their matrix gradients and Hessian superoperators; namely, we prove that these objective functions are $β$-smooth and $γ$-strongly convex/concave, where $β$ and $γ$ depend on the max-relative entropies of $ρ$ and $σ$. A practical consequence of these properties is that we can conduct Nesterov-accelerated projected gradient descent/ascent, a well-known classical optimization technique, to calculate the measured relative entropy and measured Rényi relative entropy to arbitrary precision. These algorithms are generally more memory efficient than our previous algorithms based on semidefinite optimization [Huang and Wilde, arXiv:2406.19060], and for well-conditioned states $ρ$ and $σ$, these algorithms are notably faster.
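To make the accelerated scheme concrete, here is a minimal sketch, not the authors' implementation: it assumes the variational form $D_M(ρ\|σ) = \sup_{ω>0} \operatorname{Tr}[ρ \log ω] + 1 - \operatorname{Tr}[σω]$ and restricts to commuting states, so the operators reduce to probability vectors `p` and `q` and the maximizer is $w_i = p_i/q_i$. The box constraint, step size, and momentum schedule are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

# Commuting-case sketch: maximize f(w) = sum_i p_i log w_i + 1 - sum_i q_i w_i
# over w > 0.  The maximum is the classical relative entropy D(p||q),
# attained at w_i = p_i / q_i.

def accelerated_ascent(p, q, steps=5000):
    """Nesterov-accelerated projected gradient ascent on f(w)."""
    ratio = p / q
    lo, hi = 0.5 * ratio.min(), 2.0 * ratio.max()  # box containing the maximizer
    beta = p.max() / lo**2                         # smoothness bound of f on the box
    eta = 1.0 / beta                               # standard 1/beta step size
    w = np.clip(np.ones_like(p), lo, hi)
    w_prev = w
    for k in range(1, steps + 1):
        # Momentum (extrapolation) point, clipped so the gradient stays defined.
        y = np.clip(w + (k - 1) / (k + 2) * (w - w_prev), lo, hi)
        grad = p / y - q                           # gradient of f at y
        w_prev, w = w, np.clip(y + eta * grad, lo, hi)  # ascent step + projection
    return w

def objective(w, p, q):
    return float(np.sum(p * np.log(w)) + 1.0 - np.sum(q * w))
```

For `p == q` the maximizer is `w = 1` and the objective is 0; in general `objective(accelerated_ascent(p, q), p, q)` approaches $D(p\|q)$. The full quantum algorithm replaces these vector operations with matrix gradients of $\operatorname{Tr}[ρ \log ω]$, which is where the paper's Hessian-superoperator analysis enters.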
Problem

Research questions and friction points this paper is trying to address.

Optimizing variational formulas for measured quantum relative entropies
Establishing smoothness and convexity properties of objective functions
Developing accelerated gradient algorithms for efficient entropy calculation
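For reference, the variational formulas mentioned above optimize over positive definite operators $ω$; in one known form (e.g. from Berta, Fawzi, and Tomamichel — the paper may use an equivalent expression):

$$
D_M(ρ\,\|\,σ) \;=\; \sup_{ω > 0}\ \operatorname{Tr}[ρ \log ω] + 1 - \operatorname{Tr}[σ ω] \;=\; \sup_{ω > 0}\ \operatorname{Tr}[ρ \log ω] - \log \operatorname{Tr}[σ ω],
$$

where the two forms agree because rescaling $ω \mapsto cω$ and optimizing over $c > 0$ eliminates the constant.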
Innovation

Methods, ideas, or system contributions that make the work stand out.

Used Nesterov-accelerated projected gradient descent/ascent
Proved smoothness and strong convexity/concavity of the variational objectives
Improved memory efficiency and speed over SDP-based methods