Low Stein Discrepancy via Message-Passing Monte Carlo

📅 2025-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
To enable high-quality sampling from general multivariate probability distributions, this paper proposes Stein-MPMC, a novel method that integrates the Message-Passing Monte Carlo (MPMC) framework with kernelized Stein discrepancy (KSD) optimization for the first time. Stein-MPMC adjusts the sample set via a message-passing mechanism to minimize the KSD directly, thereby removing a key limitation of conventional MPMC, which is restricted to generating uniform point sets. Technically, the approach unifies geometric deep learning, message-passing neural networks, and Monte Carlo optimization to enable efficient low-discrepancy sampling from arbitrary distributions with known density functions. Benchmark experiments demonstrate that Stein-MPMC achieves significantly lower KSD than competing methods, including Stein variational gradient descent and greedy Stein point selection.
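The KSD objective described above only requires the score function ∇ log p, not the normalizing constant of the density. The paper's implementation is not shown here; as a hedged illustration, the following NumPy sketch evaluates the squared KSD as a V-statistic with an inverse multiquadric (IMQ) base kernel k(x, y) = (c² + ‖x − y‖²)^β, a common choice in the Stein discrepancy literature. The bandwidth `c` and exponent `beta` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def ksd_squared(x, score, c=1.0, beta=-0.5):
    """Squared V-statistic KSD of points x (shape (n, d)) w.r.t. a density
    with score function `score` (grad log p), using an IMQ base kernel
    k(x, y) = (c^2 + ||x - y||^2)^beta."""
    n, d = x.shape
    s = score(x)                              # (n, d) scores at the points
    diff = x[:, None, :] - x[None, :, :]      # (n, n, d) pairwise x_i - x_j
    r2 = c**2 + np.sum(diff**2, axis=-1)      # (n, n)
    k = r2**beta                              # base kernel values
    gkx = 2 * beta * r2[..., None]**(beta - 1) * diff   # grad_x k(x_i, x_j)
    # trace of the mixed second derivative grad_x . grad_y k
    tr = (-4 * beta * (beta - 1) * r2**(beta - 2) * np.sum(diff**2, -1)
          - 2 * beta * d * r2**(beta - 1))
    # Stein kernel: tr + grad_x k . s(y) + grad_y k . s(x) + k s(x).s(y),
    # using grad_y k = -grad_x k for a radial kernel
    kp = (tr
          + np.einsum('ijk,jk->ij', gkx, s)
          - np.einsum('ijk,ik->ij', gkx, s)
          + k * (s @ s.T))
    return kp.mean()
```

Because the Stein kernel is positive semi-definite, the V-statistic is non-negative, and a lower value indicates a sample set that better represents the target; e.g. for a standard Gaussian (score(x) = −x), a shifted copy of a good sample scores markedly worse.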

📝 Abstract
Message-Passing Monte Carlo (MPMC) was recently introduced as a novel low-discrepancy sampling approach leveraging tools from geometric deep learning. While originally designed for generating uniform point sets, we extend this framework to sample from general multivariate probability distributions with a known probability density function. Our proposed method, Stein-Message-Passing Monte Carlo (Stein-MPMC), minimizes a kernelized Stein discrepancy, ensuring improved sample quality. Finally, we show that Stein-MPMC outperforms competing methods, such as Stein Variational Gradient Descent and (greedy) Stein Points, by achieving a lower Stein discrepancy.
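The Stein-MPMC model itself (a message-passing neural network acting on the point set) is beyond a short sketch, but the training objective can be illustrated by descending the same KSD directly with respect to the particle positions. Everything below is an illustrative assumption rather than the paper's algorithm: the IMQ kernel, the finite-difference gradient, the step size, and the standard-Gaussian target; the KSD evaluation is repeated so the snippet is self-contained.

```python
import numpy as np

def ksd2(x, score, c=1.0, beta=-0.5):
    # Squared V-statistic KSD with an IMQ base kernel (c^2 + ||x-y||^2)^beta.
    n, d = x.shape
    s = score(x)
    diff = x[:, None, :] - x[None, :, :]
    r2 = c**2 + np.sum(diff**2, axis=-1)
    gkx = 2 * beta * r2[..., None]**(beta - 1) * diff
    tr = (-4 * beta * (beta - 1) * r2**(beta - 2) * np.sum(diff**2, -1)
          - 2 * beta * d * r2**(beta - 1))
    kp = (tr + np.einsum('ijk,jk->ij', gkx, s)
          - np.einsum('ijk,ik->ij', gkx, s) + r2**beta * (s @ s.T))
    return kp.mean()

def descend_ksd(x, score, lr=0.02, steps=50, eps=1e-5):
    """Crude KSD minimization: finite-difference gradient descent on the
    particle positions. Stein-MPMC instead trains a message-passing network
    to produce the points; this only illustrates the objective."""
    x = x.copy()
    for _ in range(steps):
        base = ksd2(x, score)
        grad = np.zeros_like(x)
        for idx in np.ndindex(*x.shape):
            xp = x.copy()
            xp[idx] += eps
            grad[idx] = (ksd2(xp, score) - base) / eps
        x -= lr * grad
    return x
```

Starting from a poor (clumped) initial sample of a standard Gaussian, a few dozen descent steps visibly reduce the KSD, which is the quantity the paper's benchmarks compare across methods.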
Problem

Research questions and friction points this paper is trying to address.

Conventional MPMC is restricted to generating uniform point sets
Sampling from general distributions with a known density requires a suitable discrepancy to minimize
Existing Stein-based samplers (SVGD, greedy Stein Points) leave room for lower-discrepancy samples
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends MPMC to general multivariate distributions with known densities
Optimizes sample points against a kernelized Stein discrepancy objective
Achieves lower Stein discrepancy than SVGD and (greedy) Stein Points