🤖 AI Summary
This work investigates the strong converse exponent for the classical soft-covering problem: the slowest exponential rate at which the total variation distance between the synthesized distribution and the target product distribution converges to one when the coding rate falls below the channel's mutual information. Using a type-based analysis, the authors derive a lower bound and an upper bound on this exponent, and show that the upper bound admits an equivalent formulation in terms of Rényi mutual information. The two bounds do not yet coincide, and closing this gap to obtain an exact characterization of the strong converse exponent is left as future work.
📝 Abstract
In this paper, we provide a lower bound and an upper bound on the strong converse exponent of the soft-covering problem in the classical setting. This exponent characterizes the slowest achievable speed at which the total variation distance converges to one when a code with rate below the mutual information is applied to a discrete memoryless channel to synthesize a product output distribution. We employ a type-based approach and additionally propose an equivalent form of our upper bound in terms of the Rényi mutual information. Future work includes tightening these two bounds to determine the exact value of the strong converse exponent.
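For context, Rényi mutual information is typically built from the Rényi divergence; one common variant is Sibson's definition. The specific variant used in the paper is not stated here, so the following standard definitions are an assumption for illustration:

```latex
% Rényi divergence of order \alpha \in (0,1) \cup (1,\infty) between
% distributions P and Q on a finite alphabet:
D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha} Q(x)^{1-\alpha}.

% Sibson's Rényi mutual information of order \alpha (one common variant),
% optimizing over output distributions Q_Y:
I_\alpha(X; Y) = \min_{Q_Y} D_\alpha\!\left( P_{XY} \,\big\|\, P_X \times Q_Y \right).
```

As $\alpha \to 1$, both quantities recover the Kullback-Leibler divergence and the ordinary mutual information, which is consistent with the mutual information threshold appearing in the soft-covering rate condition.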