🤖 AI Summary
This work investigates the variational sensitivity of the expectation of a function to perturbations of the underlying probability measure. Using functional variational analysis and information-geometric techniques, the authors derive a closed-form expression for the functional derivative of the expectation with respect to the measure, and they rigorously establish its connections to mutual information, lautum information, and Gibbs measures. The result unifies the characterization of how measure perturbations affect expectations and extends the variational interpretation of Gibbs measures. In statistical inference and Bayesian sensitivity analysis, it enables interpretable and computationally tractable quantification of measure dependence, providing both theoretical foundations and practical tools for assessing the robustness of probabilistic models.
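To make the flavor of such a variation concrete, here is a standard first-order sketch, not the paper's exact formula: the symbols $P$, $Q$, $f$, and $\lambda$, and the specific mixture-path perturbation, are illustrative assumptions.

```latex
% Illustrative first variation of the expectation along the
% mixture path P_t = (1 - t) P + t Q (an assumed perturbation,
% not necessarily the one used in the paper):
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\,
  \mathbb{E}_{P_t}[f]\Big|_{t=0}
  \;=\; \mathbb{E}_{Q}[f] - \mathbb{E}_{P}[f]
  \;=\; \int f \,\mathrm{d}(Q - P).
\]
% The Gibbs connection arises in the classical variational principle:
% the measure minimizing E_P[f] + \lambda D(P \| Q) over P has
% density proportional to e^{-f/\lambda} with respect to Q.
```

This is only the directional (Gateaux) derivative along a linear path; the paper's closed-form expressions and their information-theoretic identities are more general.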
📝 Abstract
Closed-form expressions are presented for the variation of the expectation of a given function induced by changes in the probability measure with respect to which the expectation is taken. These expressions reveal connections with Gibbs probability measures, the mutual information, and the lautum information.
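A minimal numerical sketch of the idea on a finite support, assuming a simple mixture perturbation of the measure (the distributions `p`, `q` and the function values `f` below are arbitrary illustrative data, not from the paper): the derivative of the expectation along the path `(1 - t) * p + t * q` matches the closed-form difference of expectations.

```python
import numpy as np

# Hypothetical discrete setup: two probability vectors p, q on the same
# finite support, and a function f given by its values on that support.
rng = np.random.default_rng(0)
p = rng.random(5); p /= p.sum()
q = rng.random(5); q /= q.sum()
f = rng.random(5)

def expectation(weights, values):
    """Expectation of `values` under the discrete measure `weights`."""
    return float(weights @ values)

def mixture_expectation(t):
    """E_{P_t}[f] along the mixture path P_t = (1 - t) P + t Q."""
    return expectation((1 - t) * p + t * q, f)

# Closed-form first variation: d/dt E_{P_t}[f] at t = 0 is E_Q[f] - E_P[f].
closed_form = expectation(q, f) - expectation(p, f)

# Finite-difference check of the same derivative.
h = 1e-6
numeric = (mixture_expectation(h) - mixture_expectation(0.0)) / h

print(abs(numeric - closed_form) < 1e-6)
```

Because the expectation is linear in `t` along a mixture path, the finite difference agrees with the closed form up to floating-point rounding; the paper's contribution lies in the general closed-form expressions and their links to Gibbs measures and (lautum) mutual information, which this toy check does not reproduce.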