Stationary MMD Points for Cubature

📅 2025-05-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the deterministic approximation of a target probability distribution by a finite set of points, focusing on kernel-based numerical integration (cubature) via the maximum mean discrepancy (MMD). To overcome the computational intractability of global MMD minimisation, the authors propose targeting *stationary points* of the MMD instead. They establish, for the first time, a *super-convergence* property in the RKHS: the cubature error of stationary MMD points decays faster than the MMD itself. They also derive the first non-asymptotic, finite-particle upper bound on the gradient flow error and design a practical discrete gradient flow algorithm with rigorous theoretical guarantees. The approach improves the accuracy and stability of high-dimensional numerical integration, offering a new paradigm for quadrature, data compression, and optimisation.

📝 Abstract
Approximation of a target probability distribution using a finite set of points is a problem of fundamental importance, arising in cubature, data compression, and optimisation. Several authors have proposed to select points by minimising a maximum mean discrepancy (MMD), but the non-convexity of this objective precludes global minimisation in general. Instead, we consider *stationary* points of the MMD which, in contrast to points globally minimising the MMD, can be accurately computed. Our main theoretical contribution is the (perhaps surprising) result that, for integrands in the associated reproducing kernel Hilbert space, the cubature error of stationary MMD points vanishes *faster* than the MMD. Motivated by this *super-convergence* property, we consider discretised gradient flows as a practical strategy for computing stationary points of the MMD, presenting a refined convergence analysis that establishes a novel non-asymptotic finite-particle error bound, which may be of independent interest.
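The discretised MMD gradient flow mentioned in the abstract can be sketched as follows. This is an illustrative NumPy implementation, not the authors' exact algorithm: it assumes a Gaussian kernel, a standard Gaussian target represented by a finite sample (so the target's kernel mean embedding is approximated empirically), and a fixed step size for the explicit Euler discretisation.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """RBF kernel matrix k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Squared MMD between the empirical measures of X and Y."""
    n, m = len(X), len(Y)
    return (gaussian_kernel(X, X, sigma).sum() / n**2
            - 2 * gaussian_kernel(X, Y, sigma).sum() / (n * m)
            + gaussian_kernel(Y, Y, sigma).sum() / m**2)

def mmd_grad(X, Y, sigma=1.0):
    """Gradient of mmd2 with respect to each particle in X."""
    n, m = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)           # (n, n)
    Kxy = gaussian_kernel(X, Y, sigma)           # (n, m)
    diff_xx = X[:, None, :] - X[None, :, :]      # (n, n, d)
    diff_xy = X[:, None, :] - Y[None, :, :]      # (n, m, d)
    # For the Gaussian kernel: d/dx k(x, z) = -(x - z) / sigma^2 * k(x, z)
    return (-2 / (n * sigma**2)) * (
        (Kxx[:, :, None] * diff_xx).sum(1) / n
        - (Kxy[:, :, None] * diff_xy).sum(1) / m
    )

# Drive n = 20 particles toward a 2-D standard Gaussian target,
# here represented by m = 500 samples, by gradient descent on MMD^2.
rng = np.random.default_rng(0)
Y = rng.normal(size=(500, 2))            # stand-in for the target
X0 = rng.uniform(-3.0, 3.0, size=(20, 2))
X = X0.copy()
for _ in range(300):
    X -= 0.1 * mmd_grad(X, Y)            # explicit Euler step on the MMD flow
```

At convergence, `mmd_grad(X, Y)` vanishes, i.e. `X` is (approximately) a set of stationary MMD points in the sense targeted by the paper; the paper's super-convergence result concerns the cubature error of exactly such configurations.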
Problem

Research questions and friction points this paper is trying to address.

Approximating target distributions with finite point sets
Computing stationary MMD points for accurate cubature
Achieving super-convergence in cubature error rates
Innovation

Methods, ideas, or system contributions that make the work stand out.

Stationary MMD points for cubature approximation
Super-convergence in reproducing kernel Hilbert space
Discretised gradient flows for MMD stationary points
Zonghao Chen
University College London, UK
Toni Karvonen
Lappeenranta–Lahti University of Technology LUT, FI
Heishiro Kanagawa
Newcastle University
Machine Learning, Kernel Methods, Generative Modelling
François-Xavier Briol
University College London, UK
Chris. J. Oates
Newcastle University
Statistics