Efficient Stochastic Optimisation via Sequential Monte Carlo

📅 2026-01-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of optimizing objectives with intractable gradients—such as maximum marginal likelihood estimation and generative model fine-tuning—by introducing, for the first time, a sequential Monte Carlo (SMC) sampler into a gradient-free optimization framework. The proposed approach replaces the conventional inner-loop sampling procedure to efficiently approximate stochastic gradients, substantially reducing computational overhead while preserving optimization performance. A theoretical convergence analysis is provided to support the method’s validity. Empirical evaluations across multiple energy-based models on reward tuning tasks demonstrate that the proposed algorithm achieves significant acceleration without compromising, and in some cases even improving, optimization quality.

Technology Category

Application Category

📝 Abstract
The problem of optimising functions with intractable gradients frequently arise in machine learning and statistics, ranging from maximum marginal likelihood estimation procedures to fine-tuning of generative models. Stochastic approximation methods for this class of problems typically require inner sampling loops to obtain (biased) stochastic gradient estimates, which rapidly becomes computationally expensive. In this work, we develop sequential Monte Carlo (SMC) samplers for optimisation of functions with intractable gradients. Our approach replaces expensive inner sampling methods with efficient SMC approximations, which can result in significant computational gains. We establish convergence results for the basic recursions defined by our methodology which SMC samplers approximate. We demonstrate the effectiveness of our approach on the reward-tuning of energy-based models within various settings.
Problem

Research questions and friction points this paper is trying to address.

intractable gradients
stochastic optimisation
function optimisation
machine learning
statistical inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sequential Monte Carlo
Stochastic Optimization
Intractable Gradients
SMC Samplers
Energy-Based Models
🔎 Similar Papers
No similar papers found.
J
James Cuin
Department of Mathematics, Imperial College London, London, United Kingdom
Davide Carbone
Davide Carbone
École Normale Supérieure, LPENS
Statistical MechanicsMachine Learning
Y
Yanbo Tang
Department of Mathematics, Imperial College London, London, United Kingdom
O. Deniz Akyildiz
O. Deniz Akyildiz
Imperial
computational statisticsgenerative modelsoptimization