Communication Complexity of Exact Sampling under Rényi Information

📅 2025-06-13
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Studies the communication complexity of exactly sampling from a distribution P when the parties share a sequence distributed according to Q. Using the Poisson functional representation, the known upper bound on the minimum expected message length is generalized to Rényi entropy and Campbell cost, with upper and lower bounds growing approximately as the Rényi divergence D_{1/α}(P||Q).

📝 Abstract
We study the problem of communicating a sample from a probability distribution $P$ given shared access to a sequence distributed according to another probability distribution $Q$. Li and El Gamal used the Poisson functional representation to show that the minimum expected message length to communicate a sample from $P$ can be upper bounded by $D(P\|Q) + \log(D(P\|Q) + 1) + 4$, where $D(\,\cdot\,\|\,\cdot\,)$ is the Kullback-Leibler divergence. We generalize this and related results to a cost which is exponential in the message length, specifically $L(t)$, Campbell's average codeword length of order $t$, and to Rényi's entropy. We lower bound the Campbell cost and Rényi entropy of communicating a sample under any (possibly noncausal) sampling protocol, showing that it grows approximately as $D_{1/\alpha}(P\|Q)$, where $D_\eta(\,\cdot\,\|\,\cdot\,)$ is the Rényi divergence of order $\eta$. Using the Poisson functional representation, we prove an upper bound on $L(t)$ and $H_\alpha(K)$ which has a leading Rényi divergence term with order within $\epsilon$ of the lower bound. Our results reduce to the bounds of Harsha et al. as $\alpha \to 1$. We also provide numerical examples comparing the bounds in the cases of normal and Laplacian distributions, demonstrating that the upper and lower bounds are typically within 5-10 bits of each other.
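The Li-El Gamal upper bound quoted in the abstract, $D(P\|Q) + \log(D(P\|Q) + 1) + 4$ bits, is straightforward to evaluate numerically. The following is a minimal sketch for discrete distributions; the function names and the example distributions are illustrative, not from the paper.

```python
import math

def kl_divergence(p, q):
    """D(P||Q) in bits for discrete distributions given as symbol->probability dicts.
    Assumes q[x] > 0 wherever p[x] > 0 (absolute continuity)."""
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

def li_el_gamal_upper_bound(p, q):
    """Upper bound D(P||Q) + log(D(P||Q) + 1) + 4 on the expected
    message length (in bits) for exact sampling from P given shared Q."""
    d = kl_divergence(p, q)
    return d + math.log2(d + 1) + 4

# Illustrative example: a skewed target P against a uniform shared distribution Q.
p = {"a": 0.7, "b": 0.2, "c": 0.1}
q = {"a": 1 / 3, "b": 1 / 3, "c": 1 / 3}
print(li_el_gamal_upper_bound(p, q))  # D(P||Q) ~ 0.43 bits, bound ~ 4.94 bits
```

Note the additive constant dominates when $P$ and $Q$ are close; the bound only becomes informative relative to the divergence term when $D(P\|Q)$ is large.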
Problem

Research questions and friction points this paper is trying to address.

Study communication complexity of exact sampling under Rényi information
Generalize bounds for Campbell cost and Rényi entropy in sampling
Compare upper and lower bounds for normal and Laplacian distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Poisson functional representation for bounds
Generalizes to Campbell cost and Rényi entropy
Compares bounds via numerical examples
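The Poisson functional representation underlying the bounds can be sketched as follows: given arrival times $T_1 < T_2 < \dots$ of a unit-rate Poisson process and shared samples $X_i \sim Q$, the index $K = \arg\min_i T_i / \frac{dP}{dQ}(X_i)$ yields $X_K$ distributed exactly as $P$ (Li and El Gamal). Below is a hedged sketch for finite alphabets; the stopping rule and the name `prf_sample` are my own additions, not from the paper.

```python
import random

def prf_sample(p, q, rng):
    """Exact sampling from p using a shared i.i.d. sequence distributed as q,
    via the Poisson functional representation. p, q: symbol->probability dicts;
    q must put positive mass wherever p does."""
    symbols = list(q)
    weights = [q[s] for s in symbols]
    ratio = {s: p.get(s, 0.0) / q[s] for s in symbols}  # likelihood ratio dP/dQ
    r_max = max(ratio.values())
    t = 0.0                      # Poisson process arrival time T_i
    best_val = float("inf")      # current minimum of T_i / ratio(X_i)
    best_sym = None
    while True:
        t += rng.expovariate(1.0)             # next arrival: T_i = T_{i-1} + Exp(1)
        x = rng.choices(symbols, weights)[0]  # shared sample X_i ~ Q
        if ratio[x] > 0 and t / ratio[x] < best_val:
            best_val = t / ratio[x]
            best_sym = x
        # T_i is increasing, so once even the largest ratio cannot beat
        # the current minimum, no later index can win: stop.
        if t / r_max >= best_val:
            return best_sym
```

The returned symbol's index in the shared sequence is what the encoder would communicate; the message-length analysis in the paper bounds the cost of describing that index.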
Spencer Hill
Mathematics and Statistics Department, Queen's University, Kingston, Ontario, Canada
Fady Alajaji
Queen's University
Information Theory, Communications, Joint source-channel coding, Polya contagion networks, Machine learning
Tamás Linder
Mathematics and Statistics Department, Queen's University, Kingston, Ontario, Canada