🤖 AI Summary
This work investigates second-order channel coding performance for discrete memoryless channels (DMCs) under mean and variance cost constraints and analyzes the impact of feedback on achievable coding rates. Using second-order asymptotic analysis, random coding and typicality arguments, and feedback coding bounds, we establish for the first time that variance-based cost constraints yield strictly higher achievable rates and larger second-order capacity terms than the conventional peak-power constraint, revealing that cost variability itself provides a fundamental coding gain, both with and without feedback. We further show that feedback does not improve the second-order coding rate of simple-dispersion DMCs under the peak-power constraint. Together, these results unify the characterization of feedback effects across DMCs, AWGN channels, and parallel Gaussian channels under cost constraints, and establish a new design paradigm for resource-constrained communication systems.
📝 Abstract
Channel coding for discrete memoryless channels (DMCs) with mean and variance cost constraints was recently introduced. We show that cost variability improves coding performance, both with and without feedback, relative to the traditional almost-sure cost constraint (also called the peak-power constraint), which forbids the cost from exceeding a fixed threshold on any codeword. Our result simultaneously shows that feedback does not improve the second-order coding rate of simple-dispersion DMCs under the peak-power constraint. This finding parallels similar results for unconstrained simple-dispersion DMCs, additive white Gaussian noise (AWGN) channels, and parallel Gaussian channels.