AI Summary
This work investigates the capacity of Gaussian channels under an input entropy constraint in the high signal-to-noise ratio (SNR) regime. Leveraging information-theoretic tools, asymptotic analysis, and the theory of discrete Gaussian distributions, it establishes for the first time that the capacity-achieving input distribution in this limit is a discrete Gaussian supported on a scaled integer lattice. The primary contribution is a precise characterization of the exponential decay rate of the gap between the channel capacity and the entropy-constrained input's mutual information. The authors prove that this gap vanishes exponentially fast as the SNR increases, thereby fully elucidating the capacity behavior of entropy-constrained Gaussian channels in the high-SNR limit.
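For concreteness, a discrete Gaussian on a scaled integer lattice can be written as follows; the symbols $\alpha$ (lattice spacing) and $s$ (shape parameter) are illustrative notation, not necessarily the paper's:

```latex
% Discrete Gaussian supported on the scaled lattice \alpha \mathbb{Z}
% (\alpha: lattice spacing, s: shape parameter; illustrative notation)
P(X = \alpha k) \;=\;
\frac{e^{-(\alpha k)^2 / (2 s^2)}}
     {\sum_{j \in \mathbb{Z}} e^{-(\alpha j)^2 / (2 s^2)}},
\qquad k \in \mathbb{Z}.
```

The denominator is a Jacobi-theta-type normalizer; as $\alpha/s \to 0$ the entropy of this distribution approaches the differential entropy of a Gaussian with standard deviation $s$ minus $\log \alpha$.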
Abstract
We study the input-entropy-constrained Gaussian channel capacity problem in the asymptotic high signal-to-noise ratio (SNR) regime. We show that, as the SNR goes to infinity, the capacity-achieving distribution is a discrete Gaussian supported on a scaled integer lattice. Further, we show that the gap between the input entropy and the capacity vanishes exponentially in the SNR, and we characterize this exponent.
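The claim can be checked numerically: at high SNR, a discrete Gaussian input on a scaled lattice achieves mutual information essentially equal to its entropy. The sketch below (an illustration under assumed parameters, not the paper's construction or notation) computes $I(X;Y)$ for $Y = X + N$ via $h(Y) - h(N)$ and compares it with the input entropy $H(X)$ and the unconstrained Gaussian capacity:

```python
import numpy as np

# Illustrative check (not the paper's proof): a discrete Gaussian input on the
# lattice alpha*Z over the channel Y = X + N, N ~ N(0, sigma^2), attains
# mutual information close to its entropy H(X) when SNR is high.

def discrete_gaussian(alpha, s, kmax):
    """Pmf of a discrete Gaussian on alpha*Z, truncated to |k| <= kmax."""
    x = alpha * np.arange(-kmax, kmax + 1)
    w = np.exp(-x**2 / (2.0 * s**2))
    return x, w / w.sum()

def mutual_information_bits(x, p, sigma):
    """I(X; X+N) in bits for discrete X, computed as h(Y) - h(N)."""
    y = np.linspace(x.min() - 8 * sigma, x.max() + 8 * sigma, 60001)
    fy = np.zeros_like(y)
    for xi, pi in zip(x, p):  # density of Y is a Gaussian mixture
        fy += pi * np.exp(-(y - xi)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    dy = y[1] - y[0]
    h_Y = -np.sum(fy * np.log2(np.maximum(fy, 1e-300))) * dy  # Riemann sum
    h_N = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
    return h_Y - h_N

alpha, s, sigma = 1.0, 3.0, 0.05   # lattice spacing, input shape, noise std (assumed)
x, p = discrete_gaussian(alpha, s, kmax=30)
H_X = -np.sum(p * np.log2(p))                        # input entropy (bits)
I = mutual_information_bits(x, p, sigma)
C = 0.5 * np.log2(1 + np.sum(p * x**2) / sigma**2)   # Gaussian capacity at this power
print(f"H(X) = {H_X:.3f} bits, I(X;Y) = {I:.3f} bits, C = {C:.3f} bits")
```

At these parameters the lattice points sit about twenty noise standard deviations apart, so the channel output identifies the input almost perfectly and $I(X;Y)$ matches $H(X)$ to within numerical precision, consistent with an exponentially vanishing gap.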