Breaking through the classical Shannon entropy limit: A new frontier through logical semantics

📅 2024-12-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a fundamental question in semantic information theory: *how logical meaning can enhance communication efficiency*. Methodologically, it proposes a paradigm for embedding formal logic into the classical information-theoretic framework: it constructs a logical semantic source model that integrates deductive inference systems with typical-set analysis, and designs a semantic-aware coding scheme. Theoretically, it proves that semantic constraints strictly reduce the minimum average codeword length (falling below the Shannon entropy lower bound for certain sources) and provides the first computable characterization of semantic gain, together with an explicit encoder construction. Experiments demonstrate dual advantages of semantic coding: higher decoding accuracy and shorter codewords. The work challenges the traditional semantics-agnostic foundation of information theory and establishes the first logically grounded, information-theoretically rigorous, and engineering-feasible joint logic-information modeling framework for semantic communication.

📝 Abstract
Information theory has provided foundations for the theories of several application areas critical for modern society, including communications, computer storage, and AI. A key aspect of Shannon's 1948 theory is a sharp lower bound on the number of bits needed to encode and communicate a string of symbols. When he introduced the theory, Shannon famously excluded any notion of semantics behind the symbols being communicated. This semantics-free notion went on to have massive impact on communication and computing technologies, even as multiple proposals for reintroducing semantics in a theory of information were being made, notably one in which Carnap and Bar-Hillel used logic and reasoning to capture semantics. In this paper we present, for the first time, a Shannon-style analysis of a communication system equipped with a deductive reasoning capability, implemented using logical inference. We use some of the most important techniques developed in information theory to demonstrate significant and sometimes surprising gains in communication efficiency afforded by such a capability, demonstrated also through practical codes. We thus argue that proposals for a semantic information theory should include the power of deductive reasoning to magnify the value of transmitted bits as we strive to fully unlock the inherent potential of semantics.
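The kind of gain the abstract describes can be illustrated with a toy sketch (this is an illustrative assumption, not the paper's actual construction): suppose the source emits (premise, conclusion) pairs in which the conclusion is deducible from the premise by a rule shared between sender and receiver. A semantics-agnostic, symbol-by-symbol encoder must pay roughly H(premise) + H(conclusion) bits per pair, while a semantics-aware encoder transmits only the premise and lets the receiver deduce the conclusion. The rule table and symbol names below are hypothetical.

```python
from math import log2
from collections import Counter
import random

# Toy sketch (illustrative, not the paper's scheme): a source emits
# (premise, conclusion) pairs where the conclusion follows from the
# premise by a rule that sender and receiver both know.
random.seed(0)

PREMISES = ["rain", "sun", "snow"]
RULE = {"rain": "wet", "sun": "dry", "snow": "wet"}  # shared deductive rule

def entropy(samples):
    """Empirical Shannon entropy in bits per symbol."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

premises = [random.choice(PREMISES) for _ in range(10_000)]
conclusions = [RULE[p] for p in premises]  # deterministic consequence

# Semantics-agnostic, symbol-by-symbol lower bound: H(A) + H(B).
agnostic = entropy(premises) + entropy(conclusions)
# Semantics-aware scheme: transmit A only; receiver deduces B via RULE.
semantic = entropy(premises)

print(f"agnostic bound : {agnostic:.3f} bits/pair")
print(f"semantic scheme: {semantic:.3f} bits/pair")
```

In this toy setting the saving is exactly the entropy of the deducible field; the paper's contribution is a rigorous, Shannon-style treatment of when and how much such deductive structure can be exploited.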
Problem

Research questions and friction points this paper is trying to address.

Symbolic Logic
Information Theory
Communication Systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Logical Reasoning
Communication Efficiency
Information Theory
Authors
L. Lastras, IBM Research AI, Yorktown Heights, NY, 10598, USA.
B. Trager, IBM Research AI, Yorktown Heights, NY, 10598, USA.
Jonathan Lenchner, IBM T.J. Watson Research Center. Interests: computational complexity, combinatorial & computational geometry, robotics.
Wojciech Szpankowski, Purdue University. Interests: analysis of algorithms, information theory.
Chai Wah Wu, Principal Research Scientist, IBM T. J. Watson Research Center. Interests: synchronization, chaotic dynamics, nonlinear dynamics, image processing, digital halftoning.
M. Squillante, IBM Research AI, Yorktown Heights, NY, 10598, USA.
Alexander Gray, Centaur AI Institute, Purdue University, West Lafayette, IN, 47907, USA.