🤖 AI Summary
Traditional modeling of communication networks using random variables fails to formally express structural network constraints. Method: This work models communication networks, including network coding and index coding, as confusion hypergraphs; introduces a hypergraph-based Heyting algebra supporting logical operations (conjunction, disjunction, implication) on information; and encodes network constraints as intuitionistic logic formulas. Contribution/Results: It establishes the first Curry–Howard-style correspondence between coding problems and intuitionistic logic formulas, enabling algebraic, automated derivation of optimal codes from network constraints. The framework computes the optimal communication cost, which matches the entropy of the hypergraph within a logarithmic gap, and unifies classical coding scenarios (e.g., linear, vector, and functional coding) under a single formalism. It provides a verifiable, composable, logic-based framework for code design, bridging information theory and constructive logic.
📝 Abstract
We propose using confusion hypergraphs (hyperconfusions) as a model of information. In contrast to the conventional approach using random variables, we can now perform conjunction, disjunction, and implication of information, forming a Heyting algebra. Using the connection between Heyting algebras and intuitionistic logic, we can express the requirements of a communication network (e.g., network coding, index coding, Slepian–Wolf coding) as a logical formula, allowing us to use the hypergraph Heyting algebra to directly compute the optimal coding scheme. The optimal communication cost is simply given by the entropy of the hypergraph (within a logarithmic gap). This gives a surprising correspondence between coding settings and logical formulae, similar to the Curry–Howard correspondence between proofs and computer programs.
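To give a concrete feel for the Heyting-algebra operations the abstract mentions, here is a minimal sketch in Python. It is *not* the paper's hypergraph construction; it illustrates conjunction, disjunction, and implication on a standard textbook Heyting algebra, the open sets of a small finite topology, where implication `a → b` is defined as the largest open set `c` with `a ∧ c ≤ b`.

```python
# A generic Heyting algebra on the open sets of a finite topology
# (an illustration of the operations only, not the paper's
# confusion-hypergraph algebra).

# Open sets of a topology on {0, 1, 2}: closed under union/intersection.
OPENS = [frozenset(), frozenset({0}), frozenset({1}),
         frozenset({0, 1}), frozenset({0, 1, 2})]

def meet(a, b):
    """Conjunction: intersection of open sets."""
    return a & b

def join(a, b):
    """Disjunction: union of open sets."""
    return a | b

def implies(a, b):
    """Implication a -> b: the largest open c with (a meet c) <= b.
    The union of all such c is itself such a c, so the max is unique."""
    return max((c for c in OPENS if (a & c) <= b), key=len)

# Implication in action: ({0,1} -> {1}) is the open set {1}.
print(sorted(implies(frozenset({0, 1}), frozenset({1}))))  # -> [1]

# The algebra is intuitionistic, not Boolean: for a = {0},
# a OR (a -> bottom) = {0, 1}, which is not the top element {0, 1, 2}.
a = frozenset({0})
print(sorted(join(a, implies(a, frozenset()))))  # -> [0, 1]
```

The last line shows why intuitionistic (rather than classical) logic is the right setting: the law of excluded middle can fail, which is exactly the extra expressiveness the paper exploits when encoding network constraints as formulas.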