Large Deviation Analysis for the Reverse Shannon Theorem

📅 2024-10-10
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
This paper investigates the channel simulation problem: approximating a target noisy channel using unlimited shared randomness and a noiseless communication link. It introduces Rényi divergence (with order α ∈ (0, ∞)) as a unified fidelity measure for approximation, yielding the first systematic development of a reverse Shannon theorem in the Rényi sense. The work precisely characterizes the Rényi simulation rate (the minimum communication rate required to drive the Rényi divergence asymptotically to zero) and rigorously derives the reliability function (exponential decay of the divergence) in the super-critical regime, where the rate is above the simulation rate, and the strong converse function (linear growth of the divergence) in the sub-critical regime, where the rate is below it. This framework unifies and generalizes classical results based on fidelity and total variation distance, providing a finer, parameterized information-theoretic characterization of channel simulation with explicit control over the divergence order α.

📝 Abstract
Channel simulation is the task of simulating a noisy channel using noiseless channels with unlimited shared randomness. This can be interpreted as the reverse problem to Shannon's noisy coding theorem. In contrast to previous works, our approach employs Rényi divergence (with parameter $\alpha\in(0,\infty)$) to measure the level of approximation. Specifically, we obtain the reverse Shannon theorem under the Rényi divergence, which characterizes the Rényi simulation rate, the minimum communication cost rate required for the Rényi divergence to vanish asymptotically. We also investigate the behavior of the Rényi divergence when the communication cost rate is above or below the Rényi simulation rate. When the communication cost rate is above the Rényi simulation rate, we provide a complete characterization of the convergence exponent, called the reliability function. When the communication cost rate is below the Rényi simulation rate, we determine the linear rate of increase of the Rényi divergence with parameter $\alpha\in(0,\infty]$, which implies the strong converse exponent for the $\alpha$-order fidelity.
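For reference, the abstract's central quantity is the classical Rényi divergence of order $\alpha$. The paper itself is not restated here; the following is the standard textbook definition for discrete distributions $P$ and $Q$:

```latex
D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,
  \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
\qquad \alpha \in (0,1) \cup (1,\infty),
```

with the limit $\alpha \to 1$ recovering the Kullback–Leibler divergence $D(P\,\|\,Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}$. Varying $\alpha$ interpolates between fidelity-type measures (small $\alpha$) and worst-case-type measures (large $\alpha$), which is what gives the paper's results their parameterized character.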
Problem

Research questions and friction points this paper is trying to address.

Analyzing channel simulation using Rényi divergence metrics
Determining minimum communication cost for vanishing Rényi divergence
Characterizing convergence behavior above and below simulation rate
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Rényi divergence for approximation measurement
Characterizes Rényi simulation rate for communication
Analyzes Rényi divergence behavior above/below rate
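To make the fidelity measure behind these contributions concrete, here is a minimal numerical sketch of the discrete Rényi divergence. The function name and example distributions are illustrative only, not taken from the paper:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in nats for discrete distributions.

    p, q: probability vectors over the same finite alphabet.
    alpha: divergence order in (0, inf); the alpha -> 1 limit is KL divergence.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        # alpha -> 1 recovers the Kullback-Leibler divergence
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Identical distributions give zero divergence for any order alpha
print(renyi_divergence([0.5, 0.5], [0.5, 0.5], alpha=0.5))  # → 0.0

# D_alpha is nondecreasing in alpha for fixed P, Q
p, q = [0.9, 0.1], [0.5, 0.5]
print(renyi_divergence(p, q, 0.5) <= renyi_divergence(p, q, 2.0))  # → True
```

This monotonicity in α is what makes the order a genuine "knob": tightening α strengthens the approximation criterion, which is why the paper's simulation rate and exponents are stated as functions of α.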
Shizhe Li
Institute for Advanced Study in Mathematics, School of Mathematics, Harbin Institute of Technology, Nangang District, Harbin 150001, China
Ke Li
Institute for Advanced Study in Mathematics, Harbin Institute of Technology, Nangang District, Harbin 150001, China
Lei Yu
School of Statistics and Data Science, LPMC, KLMDASR, and LEBPS, Nankai University, Tianjin 300071, China