🤖 AI Summary
This work investigates the exponential strong converse for lossy source coding with side information (the Wyner–Ziv problem) and for distributed function computation. Specifically, it establishes the first tight exponential strong converse theorems, characterizing the optimal exponential decay rate of the correct decoding probability when the coding rate is beyond the theoretical limit. Methodologically, the analysis combines change-of-measure arguments, soft Markov constraints, and the Poisson matching technique, building on recent advances for the Wyner–Ahlswede–Korner problem and developing refined information-theoretic inequalities. Key contributions include: (i) a complete characterization of the tight strong converse exponents for both problems, with exact matching between the converse and achievability bounds; and (ii) an explicit example in which the soft Markov constraint is strictly positive, demonstrating its necessity.
📝 Abstract
The exponential strong converse for a coding problem states that, if the coding rate is beyond the theoretical limit, the correct probability converges to zero exponentially. For lossy source coding with side information, also known as the Wyner-Ziv (WZ) problem, a lower bound on the strong converse exponent was derived by Oohama. In this paper, we derive the tight strong converse exponent for the WZ problem; as a special case, we also derive the tight strong converse exponent for the distributed function computation problem. For the converse part, we use the change-of-measure argument developed in the literature and the soft Markov constraint introduced by Oohama; the matching achievability is proved via the Poisson matching approach recently introduced by Li and Anantharam. Our result is built upon the recently derived tight strong converse exponent for the Wyner-Ahlswede-Korner (WAK) problem; however, compared to the WAK problem, a more sophisticated argument is needed. To illustrate the necessity of the soft Markov constraint, we present an example in which the soft Markov constraint is strictly positive.
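The notion of a strong converse exponent described above can be sketched in standard notation (a hedged formalization; the symbols $P_c^{(n)}$ and $G(R)$ are generic placeholders, not the paper's exact definitions, which depend on the full WZ problem setup):

```latex
% Sketch: exponential strong converse for a coding problem at rate R
% beyond the theoretical limit R^*.
% P_c^{(n)}(R) denotes the probability of correct decoding
% (i.e., meeting the distortion constraint) at block length n.
%
% The strong converse exponent G(R) is the optimal exponential
% decay rate of the correct probability:
\[
  G(R) \;=\; \lim_{n \to \infty} \left( -\frac{1}{n} \log P_c^{(n)}(R) \right),
  \qquad G(R) > 0 \ \text{ for } R \ \text{beyond the limit } R^*.
\]
% A "tight" exponent means matching converse and achievability bounds:
% the converse shows P_c^{(n)} <= e^{-n(G(R) - o(1))}, while the
% achievability (here via Poisson matching) shows codes exist with
% P_c^{(n)} >= e^{-n(G(R) + o(1))}.
```

The tightness claim in the abstract corresponds to the two one-sided bounds in the final comment coinciding in the exponent.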