🤖 AI Summary
This work addresses the limitations of classical information theory in characterizing the distributional uncertainty that arises from non-stationary, heterogeneous randomness in complex communication networks. It introduces sublinear expectation theory into information theory for the first time, establishing a nonlinear information-theoretic framework that rigorously defines nonlinear entropy and nonlinear mutual information. By integrating the nonlinear strong law of large numbers with distributional uncertainty modeling, the paper develops coding theorems applicable to uncertain sources and channels. The main contributions include proving that nonlinear entropy is an upper bound on the achievable source coding rate under the maximum error probability criterion and characterizing a cluster point of that rate under the minimum error probability criterion. The study further derives an upper bound on the capacity of uncertain channels and characterizes the asymptotic performance of the rate-distortion function in terms of cluster points.
📝 Abstract
A mathematical framework for information-theoretic analysis is established from a new viewpoint: transmitted messages and communication channels are described by nonlinear expectation theory, beyond the framework of classical probability theory. The major motivation of this research is to capture the distributional uncertainty inherent in increasingly complex communication networks, where random phenomena are often non-stationary and heterogeneous and cannot be characterized by a single probability distribution. Based on nonlinear expectation theory, in this paper we first explicitly define several fundamental concepts, such as nonlinear information entropy, nonlinear joint entropy, nonlinear conditional entropy, and nonlinear mutual information, and establish their basic properties. Secondly, using the strong law of large numbers under sublinear expectations, we propose a nonlinear source coding theorem, which shows that the nonlinear information entropy is an upper bound on the achievable coding rate of sources with uncertain distributions under the maximum error probability criterion, and determines a cluster point of the coding rate of such sources under the minimum error probability criterion. Thirdly, we propose a nonlinear channel coding theorem, which gives an explicit expression for the upper bound on the achievable coding rate of channels with uncertain distributions under the maximum error probability criterion, and for a cluster point under the minimum error probability criterion. Finally, we propose a nonlinear rate-distortion source coding theorem, proving that the rate-distortion function based on nonlinear mutual information is a cluster point of the lossy compression performance of sources with uncertain distributions under the minimum expected distortion criterion.
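To make the setting concrete: a sublinear expectation is commonly realized as the supremum of classical expectations over a family of candidate distributions, which is exactly the "uncertain-distribution" picture the abstract describes. The sketch below illustrates this construction, and a worst-case Shannon entropy over the uncertainty set, for a binary source whose bias is only known to lie in an interval. This is an illustrative assumption for intuition, not the paper's actual definition of nonlinear entropy; all function names here are hypothetical.

```python
import math

def upper_expectation(f, family, support):
    """Sublinear (upper) expectation over a finite family of pmfs:
    sup over P in the family of E_P[f(X)]. Illustrative sketch only."""
    return max(sum(p[x] * f(x) for x in support) for p in family)

def shannon_entropy(p, support):
    """Classical Shannon entropy (bits) of a pmf given as a dict."""
    return -sum(p[x] * math.log2(p[x]) for x in support if p[x] > 0)

def max_entropy(family, support):
    """Worst-case Shannon entropy over the uncertainty set -- one natural
    candidate for a 'nonlinear entropy', assumed here for illustration."""
    return max(shannon_entropy(p, support) for p in family)

# Binary source whose bias q = P(X = 1) is only known to lie in {0.3, 0.4, 0.5}:
support = (0, 1)
family = [{0: 1 - q, 1: q} for q in (0.3, 0.4, 0.5)]

print(upper_expectation(lambda x: x, family, support))  # worst-case mean: 0.5
print(max_entropy(family, support))                     # attained at q = 0.5: 1.0
```

Under this construction, a compression scheme whose rate exceeds the worst-case entropy works for every distribution in the family, which is the intuition behind the nonlinear entropy acting as an upper bound on the achievable coding rate under the maximum error probability criterion.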