🤖 AI Summary
This study addresses the theoretical gap in semantic information and communication for logical reasoning tasks by proposing the first semantic information theory framework tailored to first-order logic (FOL). Methodologically, it grounds semantic measures in world-state uncertainty, introducing optimizable, formally defined quantities—including semantic entropy, semantic conditional entropy, and semantic mutual information—that rigorously decouple the physical cost of symbol transmission from the logical content conveyed. It further establishes a cross-logical-system framework for semantic information comparability and directly couples semantic fidelity to reasoning performance objectives. Contributions include: (1) a joint optimization paradigm unifying semantic compression and inference-driven representation learning; (2) controllable trade-offs between transmission cost and logical content preservation in semantic compression; and (3) empirical validation on deductive reasoning tasks demonstrating that world-state-aware representations significantly improve accuracy.
📝 Abstract
First-Order Logic (FOL), also called first-order predicate calculus, is a formal language that provides a framework to comprehensively represent a world and its present state, including all of its entities, attributes, and complex interrelations, irrespective of their physical modality (e.g., text, image, or sensor data). Grounded in this universal representation, this paper develops a mathematical theory for semantic information and communication tailored to tasks involving logical reasoning and inference. For semantic communication, our framework distinguishes between two fundamental components: the physical cost of transmitting symbols of the FOL language and the logical content those symbols represent. A calibrated measure for semantic content is proposed, which allows for the consistent comparison of information value across different logical systems. This measure quantifies the degree to which a message reduces uncertainty about the true state of the world. Building on this measure, semantic entropy, conditional and mutual information metrics are defined. These metrics are then used to formulate optimizable objectives for semantic communication, designed to preserve the information most relevant for logical reasoning task at the receiver while adhering to a transmission budget. The framework's operational value is demonstrated through experiments in semantic compression, where the proposed objectives are used to manage the trade-off between transmission cost and the preservation of logical content; and deductive inference, where increasing world-state awareness improves deduction performance.