Distributional Sensitivity Analysis: Enabling Differentiability in Sample-Based Inference

📅 2025-08-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of estimating gradients of realizations of a random vector with respect to its distribution parameters when the sampler is a black box or relies on expensive physical simulations. We propose Distributional Sensitivity Analysis (DistroSA), which computes these sensitivities as partial derivatives of the inverse mapping associated with the vector of 1-D conditional distributions for arbitrary-dimensional random vectors, supplemented by a diagonal Jacobian approximation that trades some accuracy for lower cost, and by four second-order numerical algorithms for cases where closed forms are unavailable. DistroSA enables differentiable inference without requiring explicit model knowledge, access to the sampling mechanism, or high-dimensional integration. The resulting differentiable sampling framework is compatible with automatic differentiation and deep learning platforms, supporting efficient gradient propagation through otherwise non-differentiable samplers. Experiments validate the correctness and numerical robustness of the proposed algorithms, including an application to uncertainty quantification and parameter inversion for quantum correlation functions in nuclear physics. The open-source DistroSA package ensures full reproducibility.

📝 Abstract
We present two analytical formulae for estimating the sensitivity -- namely, the gradient or Jacobian -- at given realizations of an arbitrary-dimensional random vector with respect to its distributional parameters. The first formula interprets this sensitivity as partial derivatives of the inverse mapping associated with the vector of 1-D conditional distributions. The second formula, intended for optimization methods that tolerate inexact gradients, introduces a diagonal approximation that reduces computational cost at the cost of some accuracy. We additionally provide four second-order numerical algorithms to approximate both formulae when closed forms are unavailable. We performed verification and validation studies to demonstrate the correctness of these numerical algorithms and the effectiveness of the proposed formulae. A nuclear physics application showcases how our work enables uncertainty quantification and parameter inference for quantum correlation functions. Our approach differs from existing methods by avoiding the need for model fitting, knowledge of sampling algorithms, and evaluation of high-dimensional integrals. It is therefore particularly useful for sample-based inverse problems when the sampler operates as a black box or requires expensive physics simulations. Moreover, our method renders arbitrary sampling subroutines differentiable, facilitating their integration into programming frameworks for deep learning and automatic differentiation. Algorithmic details and code implementations are provided in this paper and in our open-source software DistroSA to enable reproducibility and further development.
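As a concrete 1-D illustration of the first formula (a sketch of the underlying idea, not the paper's implementation), a realization x = F⁻¹(u; θ) with u held fixed satisfies F(x; θ) = u, and implicit differentiation gives ∂x/∂θ = −(∂F/∂θ) / p(x; θ). The sketch below applies this to a Gaussian, approximating ∂F/∂θ with a second-order central difference; the function names are illustrative, not DistroSA's API:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def sensitivity_wrt_mu(x, mu, sigma, h=1e-5):
    """Sensitivity dx/dmu at a fixed realization x.

    Uses dx/dtheta = -(dF/dtheta) / p(x; theta), with dF/dmu approximated
    by a second-order central difference in mu.
    """
    dF_dmu = (normal_cdf(x, mu + h, sigma) - normal_cdf(x, mu - h, sigma)) / (2.0 * h)
    return -dF_dmu / normal_pdf(x, mu, sigma)
```

For a Gaussian, ∂F/∂μ = −p(x; μ, σ), so the sensitivity is identically 1 regardless of x; the analogous computation for σ yields (x − μ)/σ.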
Problem

Research questions and friction points this paper is trying to address.

Estimating gradients of random vectors with respect to distributional parameters
Enabling differentiation in sample-based inference without model fitting
Facilitating uncertainty quantification in quantum correlation functions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Estimates gradients via inverse mapping derivatives
Introduces diagonal approximation for efficiency
Enables differentiation in black-box samplers
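To illustrate how per-realization sensitivities make a sampler differentiable (a minimal sketch under simplifying assumptions, not DistroSA's interface), the chain rule dE[L]/dθ ≈ mean over samples of (dL/dx)(dx/dθ) turns ordinary draws into a pathwise gradient estimator. Here the Gaussian sensitivity dx/dσ = (x − μ)/σ is used in closed form to estimate d/dσ E[x²]:

```python
import random

def pathwise_grad_sigma(mu, sigma, n_samples=200_000, seed=1):
    """Estimate d/dsigma E[x^2] for x ~ N(mu, sigma^2) via per-sample sensitivities.

    Combines the 1-D sensitivity dx/dsigma = (x - mu) / sigma with the
    chain rule: dL/dsigma is estimated as the sample mean of 2*x * dx/dsigma.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x = rng.gauss(mu, sigma)          # sampler treated as a black box
        dx_dsigma = (x - mu) / sigma      # per-realization sensitivity
        total += 2.0 * x * dx_dsigma      # chain rule with dL/dx = 2x
    return total / n_samples
```

For mu = 0 the true gradient is d/dσ (σ²) = 2σ, which the estimator recovers up to Monte Carlo error; in an autodiff framework the same per-realization sensitivity would serve as the custom backward rule of the sampling subroutine.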