High signal-to-noise ratio asymptotics of entropy-constrained Gaussian channel capacity

πŸ“… 2026-01-14
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work investigates the capacity of Gaussian channels under an input entropy constraint in the high signal-to-noise ratio (SNR) regime. Using information-theoretic tools, asymptotic analysis, and properties of discrete Gaussian distributions, it establishes that the capacity-achieving input distribution in the high-SNR limit is a discrete Gaussian supported on a scaled integer lattice. The primary contribution is a precise characterization of the exponential decay rate of the gap between the input entropy and the channel capacity: the authors prove that this gap vanishes exponentially fast as SNR increases, fully elucidating the high-SNR behavior of entropy-constrained Gaussian channel capacity.

πŸ“ Abstract
We study the input-entropy-constrained Gaussian channel capacity problem in the asymptotic high signal-to-noise ratio (SNR) regime. We show that the capacity-achieving distribution as SNR goes to infinity is given by a discrete Gaussian distribution supported on a scaled integer lattice. Further, we show that the gap between the input entropy and the capacity decreases to zero exponentially in SNR, and characterize this exponent.
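As a rough numerical illustration of the abstract's claim (not the paper's actual construction or proof technique), the sketch below builds a discrete Gaussian input on the lattice Δ·Z, passes it through an additive white Gaussian noise channel, and checks that the gap H(X) βˆ’ I(X; Y) shrinks as the noise level drops (i.e., as SNR grows). All parameter values (`delta`, `rho`, `sigma`, truncation, and grid sizes) are arbitrary choices for illustration only.

```python
import math

def discrete_gaussian(delta=1.0, rho=3.0, kmax=15):
    # pmf p(k) ∝ exp(-(delta*k)^2 / (2*rho^2)) on the lattice delta*Z,
    # truncated at |k| <= kmax (tail mass is negligible for kmax = 5*rho/delta)
    weights = [math.exp(-(delta * k) ** 2 / (2 * rho ** 2))
               for k in range(-kmax, kmax + 1)]
    z = sum(weights)
    points = [delta * k for k in range(-kmax, kmax + 1)]
    return points, [w / z for w in weights]

def entropy_bits(pmf):
    # discrete entropy H(X) in bits
    return -sum(p * math.log2(p) for p in pmf if p > 0)

def mutual_info_bits(points, pmf, sigma, half=20.0, n=4000):
    # I(X; X+N) with N ~ N(0, sigma^2), computed as h(Y) - h(N);
    # h(Y) is obtained by trapezoidal integration of the Gaussian-mixture
    # density of Y over [-half, half]
    dy = 2 * half / n
    norm = 1.0 / (sigma * math.sqrt(2 * math.pi))
    h_y = 0.0
    for i in range(n + 1):
        y = -half + i * dy
        f = sum(p * norm * math.exp(-(y - x) ** 2 / (2 * sigma ** 2))
                for x, p in zip(points, pmf))
        if f > 0:
            h_y -= f * math.log2(f) * dy * (0.5 if i in (0, n) else 1.0)
    h_n = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)
    return h_y - h_n

points, pmf = discrete_gaussian()
h_x = entropy_bits(pmf)
# sigma = 0.5: adjacent lattice points overlap heavily, so I(X;Y) < H(X);
# sigma = 0.1: points are well separated and I(X;Y) approaches H(X)
gap_low_snr = h_x - mutual_info_bits(points, pmf, sigma=0.5)
gap_high_snr = h_x - mutual_info_bits(points, pmf, sigma=0.1)
print(f"H(X) = {h_x:.4f} bits; gap at sigma=0.5: {gap_low_snr:.3e}; "
      f"gap at sigma=0.1: {gap_high_snr:.3e}")
```

The gap collapses rapidly once the noise standard deviation is small relative to the lattice spacing, which is the qualitative behavior the paper quantifies with an exact exponential decay rate in SNR.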
Problem

Research questions and friction points this paper is trying to address.

Gaussian channel
entropy-constrained
high SNR
channel capacity
asymptotics
Innovation

Methods, ideas, or system contributions that make the work stand out.

entropy-constrained capacity
discrete Gaussian distribution
high SNR asymptotics
integer lattice
exponential decay