Discrete Layered Entropy, Conditional Compression and a Tighter Strong Functional Representation Lemma

📅 2025-01-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the challenge of directly optimizing and analyzing Shannon entropy by proposing **discrete layered entropy**, a novel entropy measure that approximates the Shannon entropy within a logarithmic gap via a piecewise-linear construction. The measure admits linear programming formulations, ensuring both computational tractability and information-theoretic interpretability. Methodologically, the authors first construct a piecewise-linear entropy approximation satisfying key analytical properties, including convexity and monotonicity, while retaining a clear coding-theoretic interpretation. Building on this, they derive a new upper bound in the strong functional representation lemma that strictly improves upon the best existing bound when the mutual information is at least 2. They further establish a computable lower bound on the length of conditionally one-to-one codes and enable efficient estimation of the entropy of monotonic mixture distributions. The core contribution lies in unifying theoretical tightness with algorithmic feasibility, thereby substantially enhancing the practical utility and precision of entropy-related information-theoretic bounds.

📝 Abstract
We study a quantity called discrete layered entropy, which approximates the Shannon entropy within a logarithmic gap. Compared to the Shannon entropy, the discrete layered entropy is piecewise linear, approximates the expected length of the optimal one-to-one non-prefix-free encoding, and satisfies an elegant conditioning property. These properties make it useful for approximating the Shannon entropy in linear programming, studying the optimal length of conditional encoding, and bounding the entropy of monotonic mixture distributions. In particular, it can give a bound for the strong functional representation lemma that improves upon the best bound (as long as the mutual information is at least 2).
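The abstract contrasts the Shannon entropy with the expected length of the optimal one-to-one non-prefix-free encoding, the quantity that discrete layered entropy approximates. The sketch below illustrates that standard coding-theoretic fact, not the paper's definition of discrete layered entropy itself: assigning the i-th most probable symbol the i-th shortest binary string (including the empty string) gives a codeword of length floor(log2(i)), so the optimal one-to-one expected length can fall below H(X) by a gap logarithmic in the entropy.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def one_to_one_length(p):
    """Expected length of the optimal one-to-one (non-prefix-free) code.

    Enumerating all binary strings by length (empty string first), the
    i-th string (1-indexed) has length floor(log2(i)); assigning these
    to symbols sorted by decreasing probability is optimal.
    """
    q = sorted(p, reverse=True)
    return sum(x * math.floor(math.log2(i)) for i, x in enumerate(q, start=1))

# Uniform distribution over 8 symbols: H(X) = 3 bits, while the optimal
# one-to-one code averages (0 + 1+1 + 2+2+2+2 + 3)/8 = 1.625 bits.
p = [1/8] * 8
print(shannon_entropy(p))    # 3.0
print(one_to_one_length(p))  # 1.625
```

Because one-to-one codes drop the prefix-free requirement, their expected length is always at most H(X), and the shortfall is at most on the order of log(H(X) + 1), which is the "logarithmic gap" the abstract refers to.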
Problem

Research questions and friction points this paper is trying to address.

Discrete Layered Entropy
Shannon Entropy
Optimal Bound Theorems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Discrete Layered Entropy
Conditional Compression
Piecewise Linear Nature