🤖 AI Summary
Estimating intractable expectations over discrete domains remains challenging: conventional Monte Carlo methods, though consistent, suffer from low sample efficiency. This paper introduces BayesSum, the first extension of Bayesian quadrature to discrete domains. The integrand is modeled with a Gaussian process prior, using positive-definite kernels designed for discrete structures. Crucially, the paper establishes theoretical convergence guarantees, proving that the estimation error decays at a rate strictly faster than standard Monte Carlo. Empirical evaluation on synthetic benchmarks and real-world inference tasks, including parameter estimation for Conway-Maxwell-Poisson and Potts models, shows that the method matches or exceeds the accuracy of alternatives with significantly fewer samples, validating both its sample efficiency and its practical applicability.
📄 Abstract
This paper addresses the challenging computational problem of estimating intractable expectations over discrete domains. Existing approaches, including Monte Carlo and Russian Roulette estimators, are consistent but often require a large number of samples to achieve accurate results. We propose a novel estimator, *BayesSum*, which is an extension of Bayesian quadrature to discrete domains. It is more sample efficient than alternatives because it can exploit prior information about the integrand through a Gaussian process. We show this through theory, deriving a convergence rate significantly faster than Monte Carlo in a broad range of settings. We also demonstrate empirically that our proposed method requires fewer samples in several synthetic settings as well as in parameter estimation for Conway-Maxwell-Poisson and Potts models.