Bounds on f-Divergences between Distributions within Generalized Quasi-ε-Neighborhood

📅 2024-06-03
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the computability of $f$-divergences between probability measures within a generalized quasi-$\varepsilon$-neighborhood. We propose a generalized local information geometry framework that integrates higher-order Taylor expansions with local perturbation analysis. First, we derive a novel inverse Pinsker-type upper bound applicable to a broad class of $f$-divergences, explicitly characterizing their dependence on total variation distance. Second, under the assumption that $f$ is thrice continuously differentiable, we establish tight upper and lower bounds on the ratio of any two $f$-divergences within the neighborhood, proving both its boundedness and asymptotic equivalence. These results unify and substantially extend the classical Pinsker inequality, providing new theoretical tools and quantitative bounds for robust statistical inference, stability analysis of generative models, and distributional shift modeling.

Technology Category

Application Category

📝 Abstract
A general reverse Pinsker's inequality is derived, giving an upper bound on f-divergences in terms of total variation distance when two distributions are close as measured under our proposed generalized local information geometry framework. In addition, relationships between two f-divergences whose generating functions are third-order differentiable are established through lower and upper bounds on their ratio, when the underlying distributions lie within a generalized quasi-$\varepsilon$-neighborhood.
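For context, the classical Pinsker inequality that this paper generalizes and reverses states $D_{\mathrm{KL}}(P \| Q) \ge 2\,\delta(P,Q)^2$, where $\delta$ is total variation distance. A minimal numerical sketch (the specific distributions are hypothetical, chosen only for illustration):

```python
import math

# Two nearby discrete distributions (hypothetical example values).
p = [0.5, 0.3, 0.2]
q = [0.45, 0.35, 0.2]

# Total variation distance: TV(P, Q) = (1/2) * sum_i |p_i - q_i|.
tv = 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# KL divergence in nats: D(P || Q) = sum_i p_i * log(p_i / q_i).
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Classical Pinsker inequality: D(P || Q) >= 2 * TV(P, Q)^2.
assert kl >= 2 * tv ** 2
```

A reverse Pinsker-type inequality, as derived in the paper, instead bounds an f-divergence from above by a function of this same total variation distance when the distributions are sufficiently close.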
Problem

Research questions and friction points this paper is trying to address.

Establish bounds for f-divergences in generalized quasi-ε-neighborhoods
Unify local distribution proximity beyond structural constraints
Provide tighter reverse Pinsker's inequalities for computable bounds
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified local distribution proximity characterization
Taylor-based differentiable f-divergence classification
Tighter reverse Pinsker inequalities introduced
Xinchun Yu
Shenzhen Key Laboratory of Ubiquitous Data Enabling, Shenzhen International Graduate School, Tsinghua University, Shenzhen, 518055, China
Shuangqing Wei
School of EECS, Louisiana State University
Communication Theory, Information Theory, and Statistical Inference
Shao-Lun Huang
T-SIGS, Tsinghua University
Information Theory, Machine Learning
Xiao-Ping Zhang
Shenzhen Key Laboratory of Ubiquitous Data Enabling, Shenzhen International Graduate School, Tsinghua University, Shenzhen, 518055, China