🤖 AI Summary
The classical Shannon lower bound is loose, or not directly applicable, in several rate-distortion settings: distortion measures based on sliding-window functions, D-semifaithful coding, injective (one-to-one) coding, and the compression of individual sequences.
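For reference, the baseline being refined is the classical Shannon lower bound; for a difference distortion measure it takes the standard textbook form below, included for orientation only and not quoted from the paper:

```latex
% Classical Shannon lower bound for a continuous source X and a
% difference distortion measure \rho(x, y) = \rho(x - y):
\[
  R(D) \;\ge\; h(X) - \phi(D),
  \qquad
  \phi(D) \triangleq \max_{Z:\; \mathbb{E}[\rho(Z)] \le D} h(Z),
\]
% which for squared-error distortion specializes to
\[
  R(D) \;\ge\; h(X) - \tfrac{1}{2}\log(2\pi e D).
\]
```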
Method: The paper develops a generalized Kraft-inequality framework, combining extended Kraft inequalities, information-spectrum analysis, and sliding-window distortion modeling, to derive refined Shannon-type lower bounds for broad classes of distortion measures (e.g., sliding-window Lₚ distortion) and of codes (injective codes, D-semifaithful codes). It also establishes, for the first time, a rate-distortion lower bound in the individual-sequence setting.
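The classical Kraft inequality that this framework generalizes is shown below, again as standard background rather than the paper's extended version (which is not reproduced in this summary):

```latex
% Kraft inequality for a uniquely decodable code with codeword
% lengths L(x) over a finite alphabet \mathcal{X}:
\[
  \sum_{x \in \mathcal{X}} 2^{-L(x)} \;\le\; 1,
\]
% which implies E[L(X)] >= H(X). One-to-one (injective) codes need
% not satisfy this inequality, which is why extended Kraft-type
% inequalities are needed to lower-bound their code lengths.
```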
Results: The derived bounds strictly improve on the classical Shannon lower bound, and are markedly tighter for one-dimensional stationary sources, for sliding-window Lₚ distortion, and for arbitrary deterministic sequences. The work thus gives a more general and accurate characterization of the fundamental performance limits of lossy compression.
📝 Abstract
We derive a few extended versions of the Kraft inequality for lossy compression, which pave the way to the derivation of several refinements and extensions of the well-known Shannon lower bound in a variety of instances of rate-distortion coding. These refinements and extensions include sharper bounds for one-to-one codes and $D$-semifaithful codes, a Shannon lower bound for distortion measures based on sliding-window functions, and an individual-sequence counterpart of the Shannon lower bound.
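As a rough illustration of what a sliding-window distortion measure computes, the Python sketch below evaluates one plausible sliding-window Lₚ distortion between a sequence and its reproduction. The function name window_lp_distortion, the window convention, and the per-window Lₚ cost are all illustrative assumptions for this sketch, not the paper's definitions:

```python
import numpy as np

def window_lp_distortion(x, y, k=2, p=2.0):
    """Illustrative sliding-window L_p distortion (an assumed form,
    not the paper's): position i is charged the L_p average of the
    elementwise differences over the length-(2k+1) window centered
    at i, and the total is the per-letter average of those costs."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    assert len(y) == n, "sequences must have equal length"
    total = 0.0
    for i in range(n):
        lo, hi = max(0, i - k), min(n, i + k + 1)  # clip window at the edges
        diff = x[lo:hi] - y[lo:hi]
        total += np.mean(np.abs(diff) ** p) ** (1.0 / p)
    return total / n

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(1000)
    y = x + 0.1 * rng.standard_normal(1000)  # a noisy reproduction of x
    print(window_lp_distortion(x, y, k=2, p=2.0))
```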