🤖 AI Summary
This work addresses the comparison of $f$-divergences between probability measures lying within a generalized quasi-$\varepsilon$-neighborhood. We propose a generalized local information geometry framework that integrates higher-order Taylor expansions with local perturbation analysis. First, we derive a novel reverse Pinsker-type upper bound applicable to a broad class of $f$-divergences, explicitly characterizing their dependence on the total variation distance. Second, under the assumption that $f$ is thrice continuously differentiable, we establish tight upper and lower bounds on the ratio of any two $f$-divergences within the neighborhood, proving that the ratio is bounded and that the two divergences are asymptotically equivalent. These results unify and substantially extend the classical Pinsker inequality, providing new theoretical tools and quantitative bounds for robust statistical inference, stability analysis of generative models, and distributional shift modeling.
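For reference, the classical Pinsker inequality that these results extend bounds the total variation distance $\delta(P,Q) = \tfrac{1}{2}\|P - Q\|_{1}$ by the KL divergence (in nats):

$$
D_{\mathrm{KL}}(P \,\|\, Q) \;\geq\; 2\,\delta(P,Q)^{2}.
$$

The asymptotic equivalence of the ratio is consistent with the standard local expansion of an $f$-divergence, $D_f(P \,\|\, Q) = \tfrac{1}{2} f''(1)\, \chi^{2}(P \,\|\, Q)\,(1 + o(1))$ as $P \to Q$ (valid for $f$ twice continuously differentiable at $1$ with $f(1) = 0$; a well-known background fact, not the paper's statement), under which the ratio of two such divergences tends to $f''(1)/g''(1)$.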
📝 Abstract
A general reverse Pinsker's inequality is derived, giving an upper bound on $f$-divergences in terms of the total variation distance when two distributions are close as measured under our proposed generalized local information geometry framework. In addition, relationships between two $f$-divergences whose generator functions are three times differentiable are established in terms of lower and upper bounds on their ratio, when the underlying distributions are within a generalized quasi-$\varepsilon$-neighborhood.
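As a quick numerical illustration (a minimal sketch under the standard discrete definition $D_f(P \,\|\, Q) = \sum_x q(x)\, f(p(x)/q(x))$, not the paper's framework; the distributions, the perturbation direction, and the choice of KL and squared Hellinger generators are all illustrative), the snippet below verifies Pinsker's inequality and shows the ratio of two $f$-divergences approaching $f''(1)/g''(1) = 2$ as the distributions move closer:

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P || Q) = sum_x q(x) * f(p(x) / q(x)) for discrete distributions."""
    return float(np.sum(q * f(p / q)))

# Illustrative generators with f(1) = 0:
kl_gen = lambda t: t * np.log(t)            # KL divergence, f''(1) = 1
h2_gen = lambda t: (np.sqrt(t) - 1.0) ** 2  # squared Hellinger, g''(1) = 1/2

q = np.array([0.3, 0.5, 0.2])
direction = np.array([0.1, -0.06, -0.04])   # sums to zero, so p stays normalized

for eps in (1e-1, 1e-2, 1e-3):
    p = q + eps * direction                 # a small perturbation of q
    tv = 0.5 * np.sum(np.abs(p - q))        # total variation distance
    d_kl = f_divergence(p, q, kl_gen)
    d_h2 = f_divergence(p, q, h2_gen)
    print(f"eps={eps:.0e}  TV={tv:.2e}  "
          f"Pinsker D_KL >= 2*TV^2: {d_kl >= 2 * tv**2}  "
          f"ratio KL/H2 = {d_kl / d_h2:.4f}")  # tends to f''(1)/g''(1) = 2
```

Shrinking `eps` shows the ratio stabilizing near $2$, consistent with the asymptotic equivalence above; the reverse Pinsker-type upper bound itself depends on the paper's quasi-$\varepsilon$-neighborhood construction and is not reproduced here.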