Refinements and Generalizations of the Shannon Lower Bound via Extensions of the Kraft Inequality

📅 2025-12-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
The classical Shannon lower bound lacks precision under various distortion criteria and code classes, including sliding-window distortion, D-semifaithful coding, injective (one-to-one) coding, and the individual-sequence setting. Method: The paper develops a generalized Kraft inequality framework, combining extended Kraft inequalities, information-spectrum analysis, and sliding-window distortion modeling to derive refined Shannon-type lower bounds for broad distortion measures (e.g., sliding-window Lₚ distortion) and code classes (injective codes, D-semifaithful codes). It also introduces, for the first time, a rate-distortion lower bound in the individual-sequence setting. Results: The derived bounds strictly improve on the classical Shannon bound, yielding tighter limits for one-dimensional stationary sources, sliding-window Lₚ distortion, and arbitrary deterministic sequences, and thereby a more general and accurate characterization of fundamental performance limits in lossy compression.
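For context (background, not a result of this paper): the classical Shannon lower bound that the work refines states that, for a continuous source $X$ and a difference distortion measure $\rho(x,\hat{x})=\rho(x-\hat{x})$, the rate-distortion function is bounded below by the source's differential entropy minus the maximum entropy of an error variable meeting the distortion budget:

```latex
R(D) \;\ge\; h(X) \;-\; \max_{Z:\; \mathbb{E}[\rho(Z)] \le D} h(Z),
\qquad\text{e.g., for squared error:}\quad
R(D) \;\ge\; h(X) - \tfrac{1}{2}\log(2\pi e D).
```

The paper's contribution is to sharpen and extend bounds of this form to the code classes and distortion measures listed above.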

📝 Abstract
We derive a few extended versions of the Kraft inequality for lossy compression, which pave the way to the derivation of several refinements and extensions of the well-known Shannon lower bound in a variety of instances of rate-distortion coding. These refinements and extensions include sharper bounds for one-to-one codes and $D$-semifaithful codes, a Shannon lower bound for distortion measures based on sliding-window functions, and an individual-sequence counterpart of the Shannon lower bound.
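As a quick illustration of the starting point (the classical lossless Kraft inequality, which the paper extends to lossy settings), the following minimal Python sketch checks whether a set of codeword lengths is consistent with some prefix-free binary code; the function name and example lengths are illustrative, not from the paper:

```python
def kraft_sum(lengths):
    """Sum of 2^{-l} over codeword lengths; a prefix-free binary
    code with these lengths exists iff the sum is <= 1."""
    return sum(2.0 ** -l for l in lengths)

# Lengths of the prefix code {0, 10, 110, 111}: satisfies Kraft with equality.
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# No prefix-free binary code can have lengths {1, 1, 2}: sum exceeds 1.
print(kraft_sum([1, 1, 2]))  # 1.25
```

Shannon-type converse bounds follow from inequalities of this kind: any code must "pay" in length what the inequality permits, which is the mechanism the paper generalizes to lossy and individual-sequence settings.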
Problem

Research questions and friction points this paper is trying to address.

Extends Kraft inequality for lossy compression
Refines Shannon lower bound for coding
Generalizes bounds to various distortion measures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extended Kraft inequality for lossy compression
Sharper bounds for one-to-one and D-semifaithful codes
Shannon lower bound for sliding-window distortion measures