🤖 AI Summary
Monte Carlo estimation of Bayesian posterior expectations enjoys strong asymptotic guarantees but does not exploit smoothness of the integrand; approaches based on Stein equations can incorporate such smoothness, yet existing methods require solving large dense linear systems at prohibitive computational cost. This paper proposes a fast approximate algorithm for solving Stein equations based on iterative linear solvers and adaptive preconditioning. The authors systematically evaluate diverse preconditioning strategies, including incomplete factorizations and low-rank approximations, under the Stein operator, finding that their performance is problem-dependent and that no universally optimal strategy exists, which motivates adaptive selection. Iterative solvers such as conjugate gradient avoid the explicit construction and direct inversion of large matrices. The method achieves speedups ranging from several-fold to over an order of magnitude while preserving statistical accuracy, substantially improving the efficiency of posterior expectation estimation for smooth, high-dimensional problems.
📝 Abstract
Bayesian inference is conceptually elegant, but calculating posterior expectations can entail a heavy computational cost. Monte Carlo methods are reliable and supported by strong asymptotic guarantees, but do not leverage smoothness of the integrand. Solving Stein equations has emerged as a possible alternative, providing a framework for numerical approximation of posterior expectations in which smoothness can be exploited. However, existing numerical methods for Stein equations are associated with high computational cost due to the need to solve large linear systems. This paper considers the combination of iterative linear solvers and preconditioning strategies to obtain fast approximate solutions of Stein equations. Our main finding is that these methods can be effective, but that the performance of different preconditioning strategies is context-dependent, so that no single strategy can be universally preferred.
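The core computational idea, replacing a direct dense solve with a preconditioned iterative solver, can be illustrated with a minimal sketch. Note that the matrix below is a synthetic symmetric positive-definite system standing in for the linear system arising from a Stein equation discretization, not the paper's actual Stein operator, and incomplete LU is just one of the preconditioning strategies the paper compares; all names here are illustrative.

```python
# Hedged sketch: conjugate gradient with and without an incomplete-LU
# preconditioner, using SciPy. The system A x = b is a synthetic SPD
# stand-in for a Stein-equation linear system (assumption, not the
# paper's construction).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
n = 500

# Synthetic sparse SPD matrix: A = B B^T + shift * I.
B = sp.random(n, n, density=0.01, random_state=rng)
A = (B @ B.T + 0.1 * n * sp.eye(n)).tocsc()
b = rng.standard_normal(n)

# Unpreconditioned conjugate gradient.
x_plain, info_plain = spla.cg(A, b)

# Incomplete LU factorization used as a preconditioner M ~ A^{-1};
# its effectiveness is problem-dependent, consistent with the paper's
# finding that no single strategy is universally preferred.
ilu = spla.spilu(A)
M = spla.LinearOperator((n, n), matvec=ilu.solve)
x_pc, info_pc = spla.cg(A, b, M=M)

# info == 0 signals convergence; neither variant ever forms A^{-1}.
print(info_plain, info_pc)
```

Counting matrix-vector products via the `callback` argument of `spla.cg` would show where the preconditioner pays off, which is exactly the context-dependent trade-off the paper investigates.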