🤖 AI Summary
This work addresses a limitation of existing spherical-radial decomposition (SRD) methods: they are confined to finite-dimensional elliptical distributions and cannot efficiently provide unbiased estimates of probability functions and their gradients over infinite-dimensional random spaces, since truncation introduces bias and drives up variance. To overcome these challenges, the authors propose a hybrid infinite-dimensional SRD (hiSRD) framework that integrates subspace projection, Monte Carlo sampling, and infinite-dimensional SRD, yielding an unbiased, low-variance extension of SRD to infinite-dimensional settings. The method enables accurate computation of probability derivatives under joint chance constraints. Numerical experiments on risk-neutral stochastic PDE optimal control and on Gaussian process kernel parameter optimization show that hiSRD substantially reduces estimation variance while accurately satisfying chance constraints, demonstrating its efficacy and scalability.
📝 Abstract
The spherical-radial decomposition (SRD) is an efficient method for estimating probability functions and their gradients defined over finite-dimensional elliptical distributions. In this work, we generalize the SRD to infinite stochastic dimensions by combining subspace SRD with standard Monte Carlo methods. The resulting method, which we call hybrid infinite-dimensional SRD (hiSRD), provides an unbiased, low-variance estimator of probabilities of convex sets arising, for instance, in chance-constrained optimization. We provide a theoretical analysis of the variance of finite-dimensional SRD as the dimension increases, and show that the proposed hybrid method eliminates truncation-induced bias, reduces variance, and allows the computation of derivatives of probabilistic functions. We present comprehensive numerical studies for a risk-neutral stochastic PDE optimal control problem with joint chance state constraints, and for optimizing kernel parameters in Gaussian process regression under the constraint that the posterior process satisfies joint chance constraints.
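To make the finite-dimensional starting point concrete, the sketch below illustrates the classical SRD idea for a standard Gaussian: writing ξ = R·V with R² ~ χ²_d independent of a uniform direction V on the unit sphere, the probability of a star-shaped set K containing the origin becomes an average of the χ²_d radial CDF over directions. This is a minimal illustration of plain finite-dimensional SRD, not the hiSRD method of the paper; the box example and all variable names are our own choices for demonstration.

```python
import numpy as np
from scipy.stats import chi2, norm

rng = np.random.default_rng(0)
d, a, n = 3, 1.5, 20000  # dimension, box half-width, number of sampled directions

# Sample uniform directions on the unit sphere in R^d
v = rng.standard_normal((n, d))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Radial distance to the boundary of the box K = [-a, a]^d along each direction
r = a / np.abs(v).max(axis=1)

# SRD estimator: P(xi in K) = E_V[ F_{chi^2_d}(r(V)^2) ]
p_srd = chi2.cdf(r**2, df=d).mean()

# Exact probability for comparison: P(max_i |xi_i| <= a) = (2*Phi(a) - 1)^d
p_exact = (2 * norm.cdf(a) - 1) ** d
```

Because the randomness enters only through the direction V, the integrand varies far less than the indicator function used in crude Monte Carlo, which is the variance-reduction property the paper extends to infinite dimensions.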