BayesSum: Bayesian Quadrature in Discrete Spaces

📅 2025-12-17
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Estimating intractable expectations over discrete domains remains challenging, as conventional Monte Carlo methods—though consistent—suffer from low sample efficiency. This paper introduces the first Bayesian quadrature framework tailored to discrete spaces, extending Bayesian quadrature to discrete domains for the first time. We model the integrand using a Gaussian process prior and design positive-definite kernels specifically adapted to discrete structures. Crucially, we establish theoretical convergence guarantees, proving that the estimation error decays at a rate strictly faster than standard Monte Carlo. Empirical evaluation on synthetic benchmarks and real-world inference tasks—including parameter estimation for the Conway–Maxwell–Poisson and Potts models—demonstrates that our method achieves comparable or superior accuracy with significantly fewer samples. These results validate both the sample efficiency and practical applicability of the proposed framework.

Technology Category

Application Category

📝 Abstract
This paper addresses the challenging computational problem of estimating intractable expectations over discrete domains. Existing approaches, including Monte Carlo and Russian Roulette estimators, are consistent but often require a large number of samples to achieve accurate results. We propose a novel estimator, emph{BayesSum}, which is an extension of Bayesian quadrature to discrete domains. It is more sample efficient than alternatives due to its ability to make use of prior information about the integrand through a Gaussian process. We show this through theory, deriving a convergence rate significantly faster than Monte Carlo in a broad range of settings. We also demonstrate empirically that our proposed method does indeed require fewer samples on several synthetic settings as well as for parameter estimation for Conway-Maxwell-Poisson and Potts models.
Problem

Research questions and friction points this paper is trying to address.

Estimating intractable expectations over discrete domains
Improving sample efficiency compared to Monte Carlo methods
Using Gaussian process priors for better integration accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian quadrature extended to discrete domains
Uses Gaussian process for prior information integration
Achieves faster convergence than Monte Carlo methods
🔎 Similar Papers
No similar papers found.
S
Sophia Seulkee Kang
Independent Researcher
François-Xavier Briol
François-Xavier Briol
Professor of Statistics and Machine Learning, UCL
Bayesian ComputationStatistical Machine LearningComputational StatisticsRobustness
T
Toni Karvonen
Lappeenranta–Lahti University of Technology LUT
Z
Zonghao Chen
University College London