The First and Second Order Asymptotics of Covert Communication over AWGN Channels

📅 2023-05-29
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
This work investigates the asymptotic covert capacity of an additive white Gaussian noise (AWGN) channel under a KL-divergence covertness constraint δ. For n channel uses, it establishes, for the first time, the exact first- and second-order asymptotics: the first-order term is √(nδ log e), and the second-order term is √2·(nδ)^(1/4)·(log e)^(3/4)·Q⁻¹(ε). Methodologically, it introduces a novel information-geometric quasi-ε-neighborhood construction, extending the one-dimensional Gaussian minimum-KL result to n dimensions; this is combined with truncated Gaussian coding, refined KL-divergence analysis, and second-moment-constrained power optimization to achieve the optimal power scaling law. The analysis also derives an explicit link between the upper bound on average code power and the covertness parameter δ. Together, these results establish the fundamental second-order asymptotic benchmark for covert capacity over AWGN channels.
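Putting the two reported terms together in the usual normal-approximation form (a reading of the summary above, not a quotation from the paper: the minus sign on the second-order term follows the standard dispersion-style convention for ε < 1/2, since only the magnitudes are reported):

```latex
% Two-term normal approximation of the maximal covert throughput M*.
% The minus sign on the dispersion-style term is the standard convention
% for epsilon < 1/2; the summary above reports only the magnitudes.
\log M^*(n, \varepsilon, \delta)
  = \sqrt{n\delta \log e}
  - \sqrt{2}\,(n\delta)^{1/4} (\log e)^{3/4}\, Q^{-1}(\varepsilon)
  + o\!\big((n\delta)^{1/4}\big)
```

Here $Q^{-1}$ denotes the inverse of the Gaussian complementary CDF and $\varepsilon$ is the target error probability.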
📝 Abstract
This paper investigates the asymptotics of the maximal throughput of communication over AWGN channels by $n$ channel uses under a covert constraint in terms of an upper bound $\delta$ on the Kullback-Leibler divergence (KL divergence). It is shown that the first- and second-order asymptotics of the maximal throughput are $\sqrt{n\delta \log e}$ and $\sqrt{2}(n\delta)^{1/4}(\log e)^{3/4}\cdot Q^{-1}(\epsilon)$, respectively. The technique we use in the achievability is the quasi-$\varepsilon$-neighborhood notion from information geometry. For finite blocklength $n$, the generating distributions are chosen to be a family of truncated Gaussian distributions with decreasing variances. The law of decrease is carefully designed so that it maximizes the throughput over the main channel in the asymptotic sense, under the condition that the output distributions satisfy the covert constraint. For the converse, the optimality of the Gaussian distribution for minimizing KL divergence under a second-order moment constraint is extended from dimension $1$ to dimension $n$. Based on that, we establish an upper bound on the average power of any code satisfying the covert constraint, which further leads to the direct converse bound in terms of the covert metric.
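As a quick numerical companion, the sketch below evaluates this two-term approximation with `scipy`. The subtraction of the second-order term (for ε < 1/2) and the base-2 logarithm (so that δ is a KL budget in bits) are our assumptions; the abstract reports only the two magnitudes.

```python
# Numerical sketch of the two-term normal approximation from the abstract.
# Assumptions: the second-order term is subtracted (eps < 1/2), and log is
# base 2, so delta is a KL covertness budget measured in bits.
import numpy as np
from scipy.stats import norm

def covert_throughput_approx(n: int, delta: float, eps: float) -> float:
    """Two-term approximation of log M* (in bits) over n channel uses."""
    log_e = np.log2(np.e)                      # log e in bits, ~1.4427
    first_order = np.sqrt(n * delta * log_e)   # sqrt(n * delta * log e)
    q_inv = norm.isf(eps)                      # Q^{-1}(eps)
    second_order = np.sqrt(2) * (n * delta) ** 0.25 * log_e ** 0.75 * q_inv
    return first_order - second_order

# Example: n = 10^6 channel uses, budget delta = 0.01 bits, eps = 10^-3.
print(covert_throughput_approx(10**6, 0.01, 1e-3))
```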
Problem

Research questions and friction points this paper is trying to address.

Studies the asymptotics of maximal throughput in covert AWGN communication
Analyzes first- and second-order throughput asymptotics under a KL-divergence constraint
Explores truncated Gaussian input distributions for covert code design (see the sketch below)
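To make the truncated-Gaussian idea concrete, here is a minimal illustrative sampler, not the paper's construction: the power scaling P_n = 2σ₀²√(δ/n) (δ in nats, from the small-power approximation of the KL budget) and the ±4σ truncation window are assumptions chosen for illustration.

```python
# Illustrative sketch (not the paper's exact construction): generate a
# codeword from a truncated Gaussian whose per-symbol variance shrinks
# with the blocklength n, as the square-root law for covert AWGN suggests.
# Assumptions: power P_n = 2 * sigma0**2 * sqrt(delta / n) (delta in nats)
# and a +/- 4-sigma truncation window.
import numpy as np
from scipy.stats import truncnorm

def truncated_gaussian_codeword(n, delta_nats, sigma0=1.0, width=4.0, rng=None):
    """Sample one length-n codeword with blocklength-dependent power."""
    rng = rng or np.random.default_rng()
    power = 2.0 * sigma0**2 * np.sqrt(delta_nats / n)  # per-symbol power
    scale = np.sqrt(power)
    # truncnorm takes its bounds in standard-deviation units
    return truncnorm.rvs(-width, width, loc=0.0, scale=scale, size=n,
                         random_state=rng)

x = truncated_gaussian_codeword(n=100_000, delta_nats=0.01)
print(x.var())  # empirical per-symbol power, decreasing in n
```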
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses the quasi-ε-neighborhood notion from information geometry
Employs truncated Gaussian generating distributions with decreasing variances
Extends the Gaussian minimum-KL optimality result from one to n dimensions (see the helper below)
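The converse step rests on the fact that, under a second-moment constraint, Gaussian outputs minimize the KL divergence seen by the warden. The helper below evaluates the standard closed-form KL between isotropic n-dimensional Gaussians (a textbook formula, not code from the paper) and shows how the covertness budget couples power with blocklength.

```python
# Sanity-check helper: closed-form KL divergence (in nats) between the
# warden's two hypothesized outputs, modeled as isotropic Gaussians
# N(0, (sigma0^2 + P) I_n) vs. N(0, sigma0^2 I_n). Standard multivariate
# Gaussian formula; the modeling choice is ours, for illustration.
import numpy as np

def kl_isotropic_gaussians(n: int, P: float, sigma0: float = 1.0) -> float:
    """D( N(0,(sigma0^2+P) I_n) || N(0,sigma0^2 I_n) ) in nats."""
    r = 1.0 + P / sigma0**2            # per-dimension variance ratio
    return 0.5 * n * (r - 1.0 - np.log(r))

# For small P the divergence behaves like n * P^2 / 4, so a budget delta
# forces P ~ 2*sqrt(delta/n): the coupling behind the square-root law.
n, delta = 10**6, 0.01
P = 2.0 * np.sqrt(delta / n)           # power that roughly meets KL = delta
print(kl_isotropic_gaussians(n, P), delta)
```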
Xinchun Yu
Institute of Information and Data Science, Tsinghua Shenzhen International Graduate School, Shenzhen, China
Shuangqing Wei
School of EECS, Louisiana State University
Communication Theory, Information Theory, and Statistical Inference
Shao-Lun Huang
T-SIGS, Tsinghua University
Information Theory, Machine Learning
Xiao-Ping Zhang
Institute of Information and Data Science, Tsinghua Shenzhen International Graduate School, Shenzhen, China