🤖 AI Summary
This work investigates the minimum cardinality of the support set of input distributions achieving capacity over an amplitude-constrained additive white Gaussian noise (AWGN) channel. Addressing the long-standing conjecture that the support size must scale linearly in the amplitude constraint $A$, i.e., $\Omega(A)$, we establish the first rigorous lower bound of order $\Omega(A\sqrt{\log A})$, substantially improving upon prior results. Methodologically, we introduce a framework for quantifying output distribution uniformity, integrate a compact-domain wrapping mapping with the theory of best approximation of uniform distributions by finite Gaussian mixtures, and leverage stability analysis of the output distribution to complete the critical derivation. Our result refutes the conjectured optimality of linear scaling and provides, to date, the tightest known lower bound on the support size required for capacity-achieving input distributions. This advances the structural characterization of optimal inputs and contributes fundamentally to the information-theoretic foundations of amplitude-constrained channels.
📝 Abstract
We study the amplitude-constrained additive white Gaussian noise channel. It is well known that the capacity-achieving input distribution for this channel is discrete and supported on finitely many points. The best known bounds show that the support size of the capacity-achieving distribution is lower-bounded by a term of order $A$ and upper-bounded by a term of order $A^2$, where $A$ denotes the amplitude constraint. It was conjectured in [1] that the linear scaling is optimal. In this work, we establish a new lower bound of order $A\sqrt{\log A}$, improving the known bound and ruling out the conjectured linear scaling.
To obtain this result, we first quantify how close the capacity-achieving output distribution is to the uniform distribution in the interior of the amplitude constraint. Next, we introduce a wrapping operation that maps the problem to a compact domain and develop a theory of best approximation of the uniform distribution by finite Gaussian mixtures. These approximation bounds are then combined with stability properties of capacity-achieving distributions to yield the final support-size lower bound.
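The wrapping idea can be illustrated numerically: a uniform mixture of $n$ Gaussians with equally spaced means, wrapped onto a circle, converges rapidly to the uniform density as $n$ grows, and the residual gap measures how well a finite mixture can mimic uniformity. The sketch below is purely illustrative and is not the paper's construction; the unit period, the noise scale $\sigma = 0.1$, and the function names are assumptions made for the demo.

```python
import numpy as np

def wrapped_gaussian_pdf(x, mu, sigma, period=1.0, K=50):
    """Density of a Gaussian N(mu, sigma^2) wrapped onto a circle of
    the given period, computed by summing 2K+1 translated copies."""
    ks = np.arange(-K, K + 1)
    z = x[:, None] - mu + ks * period  # shape (len(x), 2K+1)
    return np.exp(-z**2 / (2 * sigma**2)).sum(axis=1) / (np.sqrt(2 * np.pi) * sigma)

def uniform_gap(n, sigma=0.1, period=1.0):
    """Sup-norm distance between the uniform density and the wrapped
    n-point Gaussian mixture with equally spaced means and equal weights."""
    x = np.linspace(0.0, period, 2000, endpoint=False)
    means = np.linspace(0.0, period, n, endpoint=False)
    mix = np.mean([wrapped_gaussian_pdf(x, m, sigma, period) for m in means], axis=0)
    return np.abs(mix - 1.0 / period).max()

# Doubling the number of support points shrinks the gap to uniform dramatically
# (for equally spaced means the deviation decays like exp(-2*pi^2*sigma^2*n^2)).
print(uniform_gap(4), uniform_gap(8))
```

Here the fast decay reflects why only the lowest surviving Fourier harmonic of the wrapped mixture matters; the paper's argument must instead control the best possible mixture, which is where the nontrivial $A\sqrt{\log A}$ threshold arises.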