🤖 AI Summary
This paper presents a systematic study of Sibson's α-mutual information, a Rényi-divergence-based generalization of mutual information. Addressing the lack of a unified variational representation and the measure's limited theoretical reach, the authors derive a general variational characterization of it. Leveraging this representation, they establish novel transportation-cost inequalities and Fano-type inequalities, strengthening concentration-of-measure analysis under dependence. Methodologically, the work brings together Rényi information theory, variational methods, and functional inequalities. Key contributions include: (i) tight, broadly applicable theoretical bounds; (ii) applications to statistical learning, hypothesis testing, Bayesian risk analysis, and universal prediction; and (iii) a unified analytical framework for α ≥ 1, bridging Rényi information measures with practical inference and learning tasks.
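For reference, a brief sketch of the definitions underlying the summary, following the standard conventions of the Rényi information literature (this is background context, not a statement of the paper's new variational results):

```latex
% Rényi divergence of order \alpha (for \alpha > 0, \alpha \neq 1):
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1}
    \log \mathbb{E}_Q\!\left[\left(\frac{dP}{dQ}\right)^{\alpha}\right],
% with the Kullback-Leibler divergence recovered in the limit \alpha \to 1.

% Sibson's \alpha-mutual information minimizes over the output marginal:
I_\alpha(X;Y) = \min_{Q_Y} D_\alpha\!\left(P_{XY} \,\|\, P_X \times Q_Y\right),
% which, for discrete alphabets, admits the closed form
I_\alpha(X;Y) = \frac{\alpha}{\alpha - 1}
    \log \sum_y \Bigl( \sum_x P_X(x)\, P_{Y|X}(y\mid x)^{\alpha} \Bigr)^{1/\alpha}.
```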
📝 Abstract
Information measures can be constructed from Rényi divergences much like mutual information from Kullback-Leibler divergence. One such information measure is known as Sibson's $\alpha$-mutual information and has received renewed attention recently in several contexts: concentration of measure under dependence, statistical learning, hypothesis testing, and estimation theory. In this paper, we survey and extend the state of the art. In particular, we introduce variational representations for Sibson's $\alpha$-mutual information and employ them in each of the contexts just described to derive novel results. Namely, we produce generalized transportation-cost inequalities and Fano-type inequalities. We also present an overview of known applications, spanning from learning theory and Bayesian risk to universal prediction.
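As a minimal numerical sketch, the discrete closed form above can be evaluated directly; the function name `sibson_mi` and the toy joint pmf below are illustrative choices, not taken from the paper:

```python
import numpy as np

def sibson_mi(p_xy: np.ndarray, alpha: float) -> float:
    """Sibson's alpha-mutual information (in nats) of a discrete joint pmf.

    Evaluates the closed form
        I_alpha(X;Y) = alpha/(alpha-1) * log sum_y (sum_x P_X(x) P_{Y|X}(y|x)^alpha)^(1/alpha)
    for alpha > 0, alpha != 1 (the limit alpha -> 1 recovers Shannon MI).
    Assumes p_xy has shape (|X|, |Y|) with strictly positive row sums.
    """
    if alpha <= 0 or alpha == 1.0:
        raise ValueError("alpha must be positive and different from 1")
    p_x = p_xy.sum(axis=1, keepdims=True)             # marginal P_X, shape (|X|, 1)
    p_y_given_x = p_xy / p_x                          # conditional P_{Y|X}(y|x)
    inner = (p_x * p_y_given_x ** alpha).sum(axis=0)  # sum_x P_X(x) P_{Y|X}(y|x)^alpha, one entry per y
    return alpha / (alpha - 1.0) * np.log((inner ** (1.0 / alpha)).sum())

# Toy example: a binary-symmetric-channel-like joint distribution.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(sibson_mi(p_xy, alpha=2.0))  # ~0.307 nats; the Shannon MI here is ~0.193 nats,
                                   # consistent with I_alpha being nondecreasing in alpha
```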