🤖 AI Summary
We study the communication complexity of exactly sampling from a distribution P given shared access to a sequence distributed according to Q. Using the Poisson functional representation, the known upper bound on the minimum expected message length is generalized to Rényi entropy and Campbell's cost; the resulting upper and lower bounds both grow approximately as the Rényi divergence D_{1/α}(P||Q).
📝 Abstract
We study the problem of communicating a sample from a probability distribution $P$ given shared access to a sequence distributed according to another probability distribution $Q$. Li and El Gamal used the Poisson functional representation to show that the minimum expected message length to communicate a sample from $P$ can be upper bounded by $D(P\|Q) + \log(D(P\|Q) + 1) + 4$, where $D(\cdot\,\|\,\cdot)$ is the Kullback-Leibler divergence. We generalize this and related results to a cost which is exponential in the message length, specifically $L(t)$, Campbell's average codeword length of order $t$, and to Rényi's entropy. We lower bound the Campbell cost and Rényi entropy of communicating a sample under any (possibly noncausal) sampling protocol, showing that it grows approximately as $D_{1/\alpha}(P\|Q)$, where $D_\eta(\cdot\,\|\,\cdot)$ is the Rényi divergence of order $\eta$. Using the Poisson functional representation, we prove an upper bound on $L(t)$ and $H_\alpha(K)$ which has a leading Rényi divergence term with order within $\epsilon$ of the lower bound. Our results reduce to the bounds of Harsha et al. as $\alpha \to 1$. We also provide numerical examples comparing the bounds in the cases of normal and Laplacian distributions, demonstrating that the upper and lower bounds are typically within 5-10 bits of each other.
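For readers unfamiliar with the cost functional, a brief sketch of the standard definitions may help. Campbell's average codeword length of order $t$ penalizes long codewords exponentially, and its optimum is characterized by the Rényi entropy under the usual parameter correspondence $\alpha = 1/(1+t)$ (the specific normalizations below follow Campbell's classical formulation and are stated here as background, not taken from the abstract):

```latex
% Campbell's average codeword length of order t > 0, for a prefix code
% with codeword lengths \ell(k) and message probabilities p_k:
L(t) = \frac{1}{t} \log_2 \sum_k p_k \, 2^{t\,\ell(k)}
% As t \to 0, L(t) recovers the ordinary expected length \sum_k p_k \ell(k).
% The optimal value of L(t) is governed by the Renyi entropy of order
% \alpha = 1/(1+t):
H_\alpha(K) = \frac{1}{1-\alpha} \log_2 \sum_k p_k^\alpha,
\qquad H_\alpha(K) \;\le\; \min_{\text{prefix codes}} L(t) \;<\; H_\alpha(K) + 1.
```

As $\alpha \to 1$ (equivalently $t \to 0$), $H_\alpha$ tends to the Shannon entropy, which is consistent with the abstract's remark that the results reduce to the bounds of Harsha et al. in that limit.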
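The Poisson functional representation referenced above admits a simple sampling procedure: with shared arrival times $T_1 < T_2 < \dots$ of a unit-rate Poisson process and shared i.i.d. samples $X_i \sim Q$, the encoder selects the index $K = \arg\min_i T_i / \tfrac{dP}{dQ}(X_i)$, and $X_K$ is distributed exactly according to $P$. A minimal sketch in Python, assuming a bounded density ratio so the argmin can be found with a finite stopping rule (the example distributions $P = \mathcal{N}(0,1)$, $Q = \mathcal{N}(0,2)$ are chosen for illustration and are not taken from the paper):

```python
import math
import random

def pfr_sample(log_ratio, sup_ratio, sample_q, rng):
    """Draw one exact sample from P via the Poisson functional representation.

    Selects K = argmin_i T_i / (dP/dQ)(X_i), where T_i are the arrival
    times of a rate-1 Poisson process and X_i ~ Q i.i.d.  Returns (X_K, K);
    K is the index the encoder would communicate.
    """
    t = 0.0
    best_val, best_x, best_k = float("inf"), None, 0
    i = 0
    while True:
        i += 1
        t += rng.expovariate(1.0)       # next Poisson arrival time T_i
        # Since dP/dQ <= sup_ratio, every later candidate satisfies
        # val >= T_i / sup_ratio; once that exceeds the current minimum,
        # no future index can win, so we may stop.
        if t / sup_ratio > best_val:
            return best_x, best_k
        x = sample_q(rng)
        val = t / math.exp(log_ratio(x))  # T_i / (dP/dQ)(X_i)
        if val < best_val:
            best_val, best_x, best_k = val, x, i

# Illustration: P = N(0,1), Q = N(0,2), so dP/dQ(x) = sqrt(2)*exp(-x^2/4),
# which is bounded by sqrt(2).
rng = random.Random(0)
log_ratio = lambda x: 0.5 * math.log(2.0) - x * x / 4.0
samples = [
    pfr_sample(log_ratio, math.sqrt(2.0),
               lambda r: r.gauss(0.0, math.sqrt(2.0)), rng)
    for _ in range(20000)
]
xs = [x for x, _ in samples]
mean = sum(xs) / len(xs)
var = sum(v * v for v in xs) / len(xs) - mean * mean
# mean and var should be close to 0 and 1, the moments of P = N(0,1)
```

The selected index $K$ is what gets encoded into the message; the paper's bounds control the Campbell cost and Rényi entropy of (a prefix encoding of) $K$.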