🤖 AI Summary
This work addresses the computation of the Petz–Rényi capacity of classical-quantum channels in the regime α ∈ (0,1). It proposes an iterative algorithm based on mirror descent—equivalently, exponentiated gradient—and applies this method, for the first time, to the optimization of the Petz–Rényi capacity, thereby generalizing the classical Blahut–Arimoto algorithm. By establishing relative smoothness of the objective function with respect to the entropy geometry, the authors prove that the algorithm achieves global sublinear convergence of the objective values over a truncated probability simplex. Moreover, under a nondegeneracy condition on the tangent space, they establish local linear convergence in Kullback–Leibler divergence and provide an explicit contraction factor.
📝 Abstract
We study the computation of the $\alpha$-R\'enyi capacity of a classical-quantum (c-q) channel for $\alpha\in(0,1)$. We propose an exponentiated-gradient (mirror descent) iteration that generalizes the Blahut--Arimoto algorithm. Our analysis establishes relative smoothness with respect to the entropy geometry, guaranteeing global sublinear convergence of the objective values. Furthermore, under a natural tangent-space nondegeneracy condition (and a mild spectral lower bound in one regime), we prove local linear (geometric) convergence in Kullback--Leibler divergence on a truncated probability simplex, with an explicit contraction factor once the local curvature constants are bounded.
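To make the exponentiated-gradient (mirror descent) update concrete, here is a minimal NumPy sketch for the simplest special case: a purely classical channel in the $\alpha\to 1$ limit, where maximizing mutual information with step size $\eta=1$ recovers the classical Blahut--Arimoto iteration that the paper generalizes. This is an illustration only, not the paper's c-q Petz--Rényi algorithm (which additionally involves matrix powers of the channel's output states); the names `eg_capacity` and `kl_rows` are our own.

```python
import numpy as np

def kl_rows(W, q):
    """D(W_i || q) for every row W_i of the channel matrix W."""
    with np.errstate(divide="ignore", invalid="ignore"):
        logratio = np.where(W > 0, np.log(W / q), 0.0)
    return np.sum(W * logratio, axis=1)

def eg_capacity(W, iters=500, eta=1.0):
    """Exponentiated-gradient (mirror descent) ascent of I(p) on the simplex.

    W: (m, n) row-stochastic channel matrix.  With eta = 1 the multiplicative
    update below coincides with the classical Blahut-Arimoto iteration.
    Returns (capacity estimate in nats, optimizing input distribution).
    """
    m, _ = W.shape
    p = np.full(m, 1.0 / m)          # uniform initialization in the simplex
    for _ in range(iters):
        q = p @ W                    # induced output distribution
        D = kl_rows(W, q)            # per-input relative entropies (the gradient)
        p = p * np.exp(eta * D)      # multiplicative (entropy-geometry) step
        p /= p.sum()                 # renormalize back onto the simplex
    q = p @ W
    return float(p @ kl_rows(W, q)), p
```

For a binary symmetric channel with crossover probability $\varepsilon$, the iterate converges to the uniform input and the known capacity $\ln 2 - H(\varepsilon)$ in nats, which gives a quick sanity check of the update.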