🤖 AI Summary
This paper investigates channel coding over discrete memoryless channels (DMCs) under simultaneous constraints on both the mean and the variance of codeword cost. Moving beyond conventional single-constraint (mean-only) models, it establishes the first exact second-order coding rate characterization under joint mean–variance cost constraints, shows that the maximum achievable rate equals the capacity–cost function, and proves a strong converse theorem. A generalized timid/bold coding framework unifies the treatment of arbitrary cost constraints; in addition, a feedback-aided coding scheme is designed, with a rigorous proof that feedback strictly improves second-order performance, the gain being quantified in closed form. The optimal second-order coding rate is obtained analytically, and the minimum average error probability is characterized equivalently. The core contribution is the first complete second-order information-theoretic theory for joint mean–variance cost constraints.
📝 Abstract
We consider channel coding for discrete memoryless channels (DMCs) with a novel cost constraint that bounds both the mean and the variance of the cost of the codewords. We show that the maximum (asymptotically) achievable rate under the new cost formulation is equal to the capacity-cost function; in particular, the strong converse holds. We further characterize the optimal second-order coding rate of these cost-constrained codes; in particular, the optimal second-order coding rate is finite. We then show that the second-order coding performance is strictly improved with feedback using a new variation of timid/bold coding, significantly broadening the applicability of timid/bold coding schemes from unconstrained compound-dispersion channels to all cost-constrained channels. Equivalent results on the minimum average probability of error are also given.
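To make the joint constraint concrete, here is a minimal sketch of how one might check whether a single codeword satisfies both a mean and a variance bound on its per-symbol cost. The cost function `b`, the codeword, and the bounds `gamma` (mean) and `v` (variance) are illustrative assumptions, not notation taken from the paper:

```python
from statistics import fmean, pvariance

def satisfies_mean_variance_constraint(codeword, b, gamma, v):
    """Check the joint mean-variance cost constraint for one codeword.

    codeword : sequence of channel input symbols
    b        : per-symbol cost function (hypothetical name)
    gamma    : bound on the empirical mean of the symbol costs
    v        : bound on the empirical (population) variance of the symbol costs
    """
    costs = [b(x) for x in codeword]
    return fmean(costs) <= gamma and pvariance(costs) <= v

# Example: binary input with Hamming cost b(0)=0, b(1)=1.
b = lambda x: float(x)
cw = [0, 1, 0, 1, 0, 0, 1, 0]
# Empirical mean = 0.375, empirical variance = 0.234375, so both bounds hold.
print(satisfies_mean_variance_constraint(cw, b, gamma=0.5, v=0.25))  # → True
```

A mean-only (first-moment) constraint would accept any codeword whose average cost is below `gamma`; the additional variance bound further restricts how unevenly the cost can be spread across the blocklength, which is what drives the distinct second-order behavior studied in the paper.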