🤖 AI Summary
This work addresses the joint communication and compression problem for multi-node acyclic noisy networks. It proposes the first unified one-shot coding framework, introducing a novel exponential process refinement lemma (developed from the Poisson matching lemma) which, combined with random codebook construction, establishes a general one-shot performance bound encompassing source coding, channel coding, and coding for computing. The theoretical contribution is threefold: (i) the first concise, unified one-shot analysis for multi-hop noisy networks; (ii) a one-shot theorem that recovers numerous asymptotic bounds as corollaries; and (iii) several new one-shot achievability results for relay, broadcast, and distributed function computation scenarios, improving finite-blocklength performance characterization in both accuracy and generality.
📝 Abstract
We present a unified one-shot coding framework designed for communication and compression of messages among multiple nodes across a general acyclic noisy network. Our setting can be seen as a one-shot version of the acyclic discrete memoryless network studied by Lee and Chung, and of the noisy network coding studied by Lim, Kim, El Gamal and Chung. We design a proof technique, called the exponential process refinement lemma, which is rooted in the Poisson matching lemma by Li and Anantharam and can significantly simplify the analysis of one-shot coding over multi-hop networks. Our one-shot coding theorem not only recovers a wide range of existing asymptotic results, but also yields novel one-shot achievability results in different multi-hop network information theory problems. In a broader context, our framework provides a unified one-shot bound applicable to any combination of source coding, channel coding, and coding for computing problems.
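As background intuition (a standard fact, not code from the paper): the Poisson matching lemma builds on the Poisson functional representation, in which i.i.d. unit-rate exponential variables \(E_i\) define an "exponential race" whose winner \(\operatorname{argmin}_i E_i / p(i)\) is distributed exactly according to \(p\). A minimal Python sketch of this race, with all names hypothetical:

```python
import random

def exp_race_sample(p, rng):
    """Draw an index from distribution p via the exponential race:
    with E_i ~ Exp(1) i.i.d., argmin_i E_i / p[i] equals i with
    probability p[i] (the Poisson functional representation)."""
    return min(range(len(p)), key=lambda i: rng.expovariate(1.0) / p[i])

# Empirically check that the race winner follows p.
rng = random.Random(0)
p = [0.5, 0.3, 0.2]
n = 20000
counts = [0] * len(p)
for _ in range(n):
    counts[exp_race_sample(p, rng)] += 1
freqs = [c / n for c in counts]
```

The matching lemma then bounds the probability that two such races, run with the same exponentials but different distributions (e.g., at an encoder and a decoder), select different winners; this is what replaces typicality-style union bounds in the one-shot analysis.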