Contraction of Rényi Divergences for Discrete Channels: Properties and Applications

📅 2026-01-14
🤖 AI Summary
This work investigates the strong data processing inequality (SDPI) properties of Rényi divergence under discrete channels and how they differ from those of φ-divergences. By analyzing contraction behavior across orders α, the study shows that Rényi divergence exhibits markedly different contraction characteristics from φ-divergences when α > 1. It further establishes an intrinsic connection between the ∞-Rényi divergence and ε-local differential privacy. Combining tools from Rényi divergence theory, strong data processing constants, L^α-norm contraction, and differential privacy, the paper develops a theoretical framework for characterizing Rényi divergence contraction. This framework yields new upper bounds on the convergence rate of Markov chains and deepens the understanding of the interplay between information contraction and privacy-preserving mechanisms.

📝 Abstract
This work explores properties of Strong Data-Processing constants for Rényi Divergences. Parallels are made with the well-studied $\varphi$-Divergences, and it is shown that the order $\alpha$ of Rényi Divergences dictates whether certain properties of the contraction of $\varphi$-Divergences are mirrored or not. In particular, we demonstrate that when $\alpha>1$, the contraction properties can deviate quite strikingly from those of $\varphi$-Divergences. We also uncover specific characteristics of contraction for the $\infty$-Rényi Divergence and relate it to $\varepsilon$-Local Differential Privacy. The results are then applied to bound the speed of convergence of Markov chains, where we argue that the contraction of Rényi Divergences offers a new perspective on the contraction of $L^\alpha$-norms commonly studied in the literature.
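The contraction studied in the abstract can be seen numerically: pushing two input distributions through a discrete channel never increases their Rényi divergence, and the ratio of output to input divergence lower-bounds the channel's contraction coefficient. The sketch below uses the standard Rényi divergence definitions (orders $\alpha \in (0,1)\cup(1,\infty)$, plus the KL and $\infty$ limits) and an illustrative binary symmetric channel with crossover 0.1; the channel and distributions are my own examples, not taken from the paper.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) for finite distributions.

    D_alpha = log(sum_x p(x)^alpha q(x)^(1-alpha)) / (alpha - 1),
    with the KL divergence at alpha = 1 and log max(p/q) at alpha = inf.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if alpha == 1.0:
        return float(np.sum(p * np.log(p / q)))       # KL limit
    if np.isinf(alpha):
        return float(np.log(np.max(p / q)))           # infinity-Renyi
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Illustrative row-stochastic channel: binary symmetric channel, crossover 0.1
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])

p = np.array([0.8, 0.2])
q = np.array([0.3, 0.7])

for alpha in (0.5, 1.0, 2.0, np.inf):
    d_in = renyi_divergence(p, q, alpha)
    d_out = renyi_divergence(p @ W, q @ W, alpha)     # push-forwards pW, qW
    # Data-processing inequality: divergence can only shrink through W
    assert d_out <= d_in + 1e-12
    print(f"alpha={alpha}: D(pW||qW)/D(p||q) = {d_out / d_in:.3f}")
```

The printed ratios stay strictly below 1 for this channel, illustrating strong (not merely weak) data processing; the contraction coefficient of order $\alpha$ is the supremum of such ratios over input pairs, which this single pair only lower-bounds.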
Problem

Research questions and friction points this paper is trying to address.

Rényi Divergence
Contraction
Data-Processing Inequality
Local Differential Privacy
Markov Chains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Rényi Divergence
Strong Data-Processing Inequality
Contraction Coefficient
Local Differential Privacy
Markov Chain Convergence