A Minimalist Bayesian Framework for Stochastic Optimization

📅 2025-09-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Bayesian stochastic optimization often struggles to incorporate structural constraints because the paradigm requires a probabilistic model for all parameters jointly. To address this, the paper proposes MINTS (MINimalist Thompson Sampling), a parsimonious Bayesian framework that places a prior only on the quantity of interest, such as the location of the optimal solution, and eliminates nuisance parameters via profile likelihood. This makes convex structural constraints and sequential decision-making natural to handle. The framework accommodates structured problems including continuum-armed Lipschitz bandits and dynamic pricing, and offers a probabilistic reinterpretation of classical convex optimization methods such as the center of gravity and ellipsoid methods. For multi-armed bandits, MINTS achieves near-optimal regret guarantees.

📝 Abstract
The Bayesian paradigm offers principled tools for sequential decision-making under uncertainty, but its reliance on a probabilistic model for all parameters can hinder the incorporation of complex structural constraints. We introduce a minimalist Bayesian framework that places a prior only on the component of interest, such as the location of the optimum. Nuisance parameters are eliminated via profile likelihood, which naturally handles constraints. As a direct instantiation, we develop a MINimalist Thompson Sampling (MINTS) algorithm. Our framework accommodates structured problems, including continuum-armed Lipschitz bandits and dynamic pricing. It also provides a probabilistic lens on classical convex optimization algorithms such as the center of gravity and ellipsoid methods. We further analyze MINTS for multi-armed bandits and establish near-optimal regret guarantees.
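The abstract describes MINTS only at a high level. As a hedged illustration, the sketch below shows what a profile-likelihood posterior over the optimal arm could look like for a K-armed Gaussian bandit with known noise level: for each candidate optimum i, nuisance means are profiled out by solving a small constrained least-squares problem, and Thompson sampling draws an arm from the resulting posterior. The function names (`profile_penalty`, `mints_posterior`) and the ternary-search solver are our own choices for this sketch, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def profile_penalty(i, means, counts):
    """Extra squared error incurred by constraining arm i's mean to dominate
    all others: minimize f(m) = n_i (m - xbar_i)^2 + sum over j != i of
    n_j (m - xbar_j)^2 restricted to m < xbar_j. f is convex in m, and its
    minimizer lies in [min(means), max(means)], so ternary search suffices."""
    def f(m):
        tot = counts[i] * (m - means[i]) ** 2
        for j, (xb, n) in enumerate(zip(means, counts)):
            if j != i and m < xb:
                tot += n * (m - xb) ** 2
        return tot
    lo, hi = means.min(), means.max()
    for _ in range(100):  # ternary search on the convex objective
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return f((lo + hi) / 2)

def mints_posterior(means, counts, sigma=1.0, prior=None):
    """Posterior over 'which arm is optimal': prior times profile likelihood,
    where the profile log-likelihood of candidate i differs from the
    unconstrained maximum by -profile_penalty(i) / (2 sigma^2)."""
    K = len(means)
    prior = np.ones(K) / K if prior is None else prior
    logpost = np.log(prior) - np.array(
        [profile_penalty(i, means, counts) for i in range(K)]
    ) / (2 * sigma ** 2)
    w = np.exp(logpost - logpost.max())  # normalize stably
    return w / w.sum()

# One Thompson step: sample an arm from the posterior over the optimum
means = np.array([0.1, 0.5, 0.9])   # running sample means (toy values)
counts = np.array([10, 10, 10])
post = mints_posterior(means, counts, sigma=1.0)
arm = rng.choice(len(means), p=post)
```

With a uniform prior, the arm with the largest sample mean incurs zero penalty and so receives the highest posterior probability, while trailing arms are downweighted in proportion to how strongly the data contradict their optimality.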
Problem

Research questions and friction points this paper is trying to address.

Joint probabilistic modeling of all parameters makes it hard to impose structural constraints in Bayesian stochastic optimization
How to place a prior only on the quantity of interest (e.g., the location of the optimum) while eliminating nuisance parameters
Whether such a minimalist posterior supports a Thompson-sampling-style algorithm with provable regret guarantees
Innovation

Methods, ideas, or system contributions that make the work stand out.

Minimalist Bayesian framework: prior on the component of interest only, nuisance parameters profiled out
MINTS algorithm extending Thompson sampling to structured problems (Lipschitz bandits, dynamic pricing)
Probabilistic lens on classical convex optimization (center of gravity and ellipsoid methods)