Separable Computation of Information Measures

📅 2025-01-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of directly estimating information-theoretic measures—such as mutual information, f-information, and Wyner’s common information—in high-dimensional settings. Methodologically, it proposes a learnable, separable computation framework: for feature representations satisfying mild regularity conditions, these information measures decompose into independent terms estimable separately at the feature level and the raw-data level. The framework unifies representation learning, f-divergence theory, and common information modeling into a coherent analytical paradigm for information measurement. Theoretically, it establishes novel connections between information measures and underlying statistical dependence structures, providing rigorous theoretical guarantees for information bottleneck methods and related principles. Empirically, the approach significantly enhances the feasibility, robustness, and interpretability of information estimation in high-dimensional scenarios, enabling reliable quantification where conventional estimators fail.
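The core idea of separable computation can be illustrated with a toy example (this is an illustrative sketch, not the paper's algorithm): for jointly Gaussian variables, mutual information has the closed form $I(X;Y) = -\tfrac{1}{2}\log(1-\rho^2)$, and any information-preserving feature map of $X$ (here, an invertible affine transform) leaves the measure unchanged, so it can be computed at the feature level rather than from the raw variable. All names below are my own, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mi(x, y):
    """MI estimate for (approximately) jointly Gaussian scalars:
    I(X;Y) = -0.5 * log(1 - rho^2), rho the sample correlation."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

# Jointly Gaussian pair with correlation 0.8
n = 200_000
x = rng.standard_normal(n)
y = 0.8 * x + np.sqrt(1.0 - 0.8**2) * rng.standard_normal(n)

# An information-preserving feature of X (invertible affine map)
# leaves the correlation, and hence the MI, unchanged -- the measure
# "separates" and can be evaluated at the feature level.
fx = 3.0 * x + 1.0

mi_raw = gaussian_mi(x, y)    # computed from raw data
mi_feat = gaussian_mi(fx, y)  # computed from the learned-style feature
```

For non-invertible features the two quantities differ in general; the regularity conditions in the paper characterize when feature-level computation still recovers the raw-data measure.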

📝 Abstract
We study a separable design for computing information measures, where the information measure is computed from learned feature representations instead of raw data. Under mild assumptions on the feature representations, we demonstrate that a class of information measures admits such separable computation, including mutual information, $f$-information, Wyner's common information, Gács–Körner common information, and Tishby's information bottleneck. Our development establishes several new connections between information measures and the statistical dependence structure. The characterizations also provide theoretical guarantees for practical designs that estimate information measures through representation learning.
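For reference, the standard definitions of the measures named in the abstract (textbook forms, not notation taken from the paper) are:

```latex
\begin{align*}
  I(X;Y) &= D_{\mathrm{KL}}\!\left(P_{XY} \,\middle\|\, P_X \otimes P_Y\right)
    && \text{(mutual information)}\\
  I_f(X;Y) &= D_f\!\left(P_{XY} \,\middle\|\, P_X \otimes P_Y\right)
    && \text{($f$-information, for convex $f$ with $f(1)=0$)}\\
  C_{\mathrm{W}}(X;Y) &= \min_{W:\; X \perp Y \mid W} I(X,Y;\,W)
    && \text{(Wyner's common information)}\\
  K_{\mathrm{GK}}(X;Y) &= \max_{W = g(X) = h(Y)} H(W)
    && \text{(G\'acs--K\"orner common information)}
\end{align*}
```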
Problem

Research questions and friction points this paper is trying to address.

Information Quantification
Learned Features
Theoretical Framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Information Quantification
Learned Features
Statistical Relationship