Information Physics of Intelligence: Unifying Logical Depth and Entropy under Thermodynamic Constraints

📅 2025-11-24
🤖 AI Summary
This work addresses the fundamental tension between storage capacity and computational efficiency (inference latency) in large language models by establishing the first unified information-physics framework to quantify the thermodynamic costs of generative compression and memory retrieval. Methodologically, it integrates Shannon entropy, computational complexity theory, and information-physical mapping principles to introduce "Derivation Entropy", a universal metric for the minimum effective work required to reach a target state via generation. The framework identifies a thermodynamic critical point marking a phase transition between memory retrieval and generative computation, yielding an "energy-time-space" conservation law. Key contributions include: (i) the first demonstration that Derivation Entropy minimization constitutes a shared thermodynamic principle underlying both biological intelligence and AI evolution; (ii) a physical explanation for the efficiency limits of generative models; and (iii) fundamental theoretical bounds for designing energy-efficient AI architectures.

📝 Abstract
The rapid scaling of artificial intelligence models has revealed a fundamental tension between model capacity (storage) and inference efficiency (computation). While classical information theory focuses on transmission and storage limits, it lacks a unified physical framework to quantify the thermodynamic costs of generating information from compressed laws versus retrieving it from memory. In this paper, we propose a theoretical framework that treats information processing as an enabling mapping from ontological states to carrier states. We introduce a novel metric, Derivation Entropy, which quantifies the effective work required to compute a target state from a given logical depth. By analyzing the interplay between Shannon entropy (storage) and computational complexity (time/energy), we demonstrate the existence of a critical phase transition point. Below this threshold, memory retrieval is thermodynamically favorable; above it, generative computation becomes the optimal strategy. This "Energy-Time-Space" conservation law provides a physical explanation for the efficiency of generative models and offers a rigorous mathematical bound for designing next-generation, energy-efficient AI architectures. Our findings suggest that the minimization of Derivation Entropy is a governing principle for the evolution of both biological and artificial intelligence.
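The abstract's phase transition can be illustrated with a toy cost model: charge a Landauer-style energy for every stored bit plus a logarithmic lookup cost on the retrieval side, and charge work proportional to logical depth on the generation side, then find where the two curves cross. All cost formulas and constants below are illustrative assumptions for this sketch; the paper's actual definition of Derivation Entropy is not reproduced here.

```python
# Illustrative toy model of the memory-vs-generation phase transition.
# All cost formulas and constants are assumptions made for this sketch,
# not the paper's Derivation Entropy formulas.
import math

K_B_T = 1.0  # normalized energy unit (Landauer kT scale)


def retrieval_cost(num_states: int, bits_per_state: float) -> float:
    """Assumed cost of memory retrieval: energy to store every state
    (Shannon-entropy term) plus a logarithmic lookup cost."""
    storage = num_states * bits_per_state * K_B_T * math.log(2)
    lookup = math.log2(max(num_states, 2)) * K_B_T * math.log(2)
    return storage + lookup


def generation_cost(logical_depth: int, work_per_step: float = 1.0) -> float:
    """Assumed cost of generative computation: effective work grows
    linearly with logical depth (number of derivation steps)."""
    return logical_depth * work_per_step * K_B_T * math.log(2)


def crossover(bits_per_state: float, logical_depth: int) -> int:
    """Smallest number of states at which generation becomes cheaper
    than retrieval under the toy costs above (the 'critical point')."""
    n = 1
    while retrieval_cost(n, bits_per_state) <= generation_cost(logical_depth):
        n += 1
    return n


if __name__ == "__main__":
    # With 8 bits per state and a derivation 100 steps deep, storing
    # everything wins only while the state space stays small.
    n_star = crossover(bits_per_state=8.0, logical_depth=100)
    print(f"Generation becomes favorable beyond ~{n_star} stored states")
```

Below the crossover, the storage term is small enough that paying it once beats re-deriving the state; above it, amortizing a fixed logical depth over a large state space is the thermodynamically cheaper strategy, which is the qualitative claim the abstract makes.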
Problem

Research questions and friction points this paper is trying to address.

Unifying thermodynamic costs of information generation versus memory retrieval
Quantifying computational work required to derive states from logical depth
Identifying phase transition between memory retrieval and generative computation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifying logical depth and entropy under thermodynamics
Introducing Derivation Entropy metric for computation work
Identifying phase transition between memory and computation