🤖 AI Summary
This study investigates the asymptotic behavior of rate-distortion redundancy for variable-length coding under almost-sure and expected distortion constraints. Prior work by Zhang et al. left a factor-of-two gap between an upper bound of $\log n/n$ and a lower bound of $\log n/(2n)$ on the redundancy for discrete sources. Using information-theoretic analysis within a variable-length coding framework, this work closes that gap: for a uniform source under a distortion measure satisfying certain symmetry conditions, the exact asymptotic order of the redundancy is $\log n/(2n)$. This rate is shown to be achievable under the almost-sure distortion constraint and unimprovable even when the constraint is relaxed to hold only in expectation, yielding a precise characterization of the higher-order asymptotics of rate-distortion redundancy in this symmetric setting under both distortion criteria.
📝 Abstract
For variable-length coding with an almost-sure distortion constraint, Zhang et al. show that for discrete sources the redundancy is upper bounded by $\log n/n$ and lower bounded (in most cases) by $\log n/(2n)$, ignoring lower-order terms. For a uniform source with a distortion measure satisfying certain symmetry conditions, we show that $\log n/(2n)$ is achievable and that this cannot be improved even if one relaxes the distortion constraint to be in expectation rather than with probability one.
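To make the stated rates concrete, here is a minimal numerical sketch (not from the paper) tabulating the general upper bound $\log n/n$ against the exact order $\log n/(2n)$ for a few block lengths $n$. The use of the natural logarithm is an assumption; the abstract does not specify a base, and the asymptotic order is the same for any fixed base.

```python
import math

# Compare Zhang et al.'s upper bound log(n)/n with the tight order
# log(n)/(2n) established for the uniform/symmetric setting.
# Assumption: natural logarithm (the base only rescales both columns).
for n in (100, 1_000, 10_000, 100_000):
    upper = math.log(n) / n        # log n / n   (general upper bound)
    tight = math.log(n) / (2 * n)  # log n / (2n) (exact order here)
    print(f"n = {n:>7}   log n / n = {upper:.6f}   log n / (2n) = {tight:.6f}")
```

As the output illustrates, both quantities vanish as $n$ grows, but the redundancy under the symmetry conditions is exactly half the general upper bound, up to lower-order terms.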