Nonlinear Information Theory: Characterizing Distributional Uncertainty in Communication Models with Sublinear Expectation

📅 2026-03-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of classical information theory in characterizing distributional uncertainty arising from non-stationary and heterogeneous randomness in complex communication networks. It introduces sublinear expectation theory into information theory for the first time, establishing a nonlinear information-theoretic framework that rigorously defines nonlinear entropy and mutual information. By integrating the nonlinear strong law of large numbers with distributional uncertainty modeling, the paper develops coding theorems applicable to uncertain sources and channels. The main contributions include proving that nonlinear entropy serves as an upper bound on the achievable source coding rate under maximum error probability and characterizing its accumulation points under minimum error probability. Furthermore, the study derives an upper bound on the capacity of uncertain channels and establishes the asymptotic performance of the rate-distortion function in terms of accumulation points.

📝 Abstract
A mathematical framework for information-theoretic analysis is established from a new viewpoint: transmitted messages and communication channels are described by nonlinear expectation theory, beyond the framework of classical probability theory. The major motivation of this research is to capture the distributional uncertainty inherent in increasingly complex communication networks, where random phenomena are often non-stationary and heterogeneous and cannot be characterized by a single probability distribution. Based on nonlinear expectation theory, we first explicitly define several fundamental concepts, such as nonlinear information entropy, nonlinear joint entropy, nonlinear conditional entropy, and nonlinear mutual information, and establish their basic properties. Second, by using the strong law of large numbers under sublinear expectations, we propose a nonlinear source coding theorem, which shows that the nonlinear information entropy is an upper bound on the achievable coding rate of sources with uncertain distributions under the maximum error probability criterion, and determines a cluster point of the coding rate of such sources under the minimum error probability criterion. Third, we propose a nonlinear channel coding theorem, which gives an explicit upper bound under the maximum error probability criterion and a cluster point under the minimum error probability criterion for the achievable coding rate of communication channels with uncertain distributions. Finally, we propose a nonlinear rate-distortion source coding theorem, proving that the rate-distortion function based on the nonlinear mutual information is a cluster point of the lossy compression performance of sources with uncertain distributions under the minimum expected distortion criterion.
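The core idea behind sublinear expectation is to replace the single-distribution expectation with a supremum over a family of candidate distributions, Ê[f] = sup over P in 𝒫 of E_P[f]. The sketch below is only an illustration of that idea for a finite uncertainty set, not the paper's actual definitions; in particular, the "worst-case entropy" computed here (the supremum of the Shannon entropies over the candidate set) is a hypothetical stand-in for the paper's nonlinear information entropy, which is defined via sublinear expectation and need not coincide with this construction.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of one discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def upper_expectation(f, dists):
    """Sublinear (upper) expectation over a finite uncertainty set:
    E^[f] = max over candidate distributions P of E_P[f]."""
    return max(sum(pi * f(i) for i, pi in enumerate(p)) for p in dists)

# A binary source whose distribution is only known to lie in a set 𝒫
# (values here are arbitrary for illustration).
dists = [
    [0.5, 0.5],   # fair coin
    [0.7, 0.3],   # biased coin
    [0.9, 0.1],   # heavily biased coin
]

# Upper expectation of the identity payoff X(i) = i over the set.
e_upper = upper_expectation(lambda i: i, dists)
print(e_upper)  # 0.5, attained by the fair coin

# An illustrative worst-case entropy over the uncertainty set.
h_upper = max(shannon_entropy(p) for p in dists)
print(h_upper)  # 1.0 bit, also attained by the fair coin
```

In this toy setting, a compression scheme that must work for every distribution in 𝒫 cannot beat the worst case in the set, which mirrors the abstract's claim that a nonlinear entropy upper-bounds the achievable rate under the maximum error probability criterion.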
Problem

Research questions and friction points this paper is trying to address.

distributional uncertainty
nonlinear expectation
communication models
information theory
sublinear expectation
Innovation

Methods, ideas, or system contributions that make the work stand out.

nonlinear expectation
distributional uncertainty
nonlinear information entropy
nonlinear coding theorems
sublinear expectation
Wen-Xuan Lang
National Center for Mathematics and Interdisciplinary Sciences, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China
Shaoshi Yang
Professor, Beijing University of Posts and Telecommunications
6G, MIMO, distributed AI, wireless localization, flying ad hoc network
Jianhua Zhang
Beijing University of Posts and Telecommunications, CHINA
Signal Processing, Wireless Communication, Radio Channel Measurement and Modelling, Channel Simulation, Terminal Testing
Zhiming Ma
National Center for Mathematics and Interdisciplinary Sciences, Academy of Mathematics and Systems Science, Chinese Academy of Sciences; University of Chinese Academy of Sciences, Beijing 100190, China