Isotropic Noise in Stochastic and Quantum Convex Optimization

📅 2025-10-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work studies stochastic and quantum convex optimization under isotropic noise: minimizing $d$-dimensional Lipschitz convex functions given stochastic gradients whose noise is uniformly bounded across all directions (with high probability). We formally introduce this isotropic noise model, establish matching upper and lower bounds, and design the first algorithm achieving the optimal stochastic gradient complexity of $O(d/\varepsilon^2)$, improving upon prior results by a factor of $d$. Furthermore, we construct a quantum "isotropifier" that transforms standard variance-bounded quantum gradient queries into unbiased estimates with isotropic error, yielding a quantum query complexity of $O(d^{3/2}/\varepsilon)$, a strict improvement over the best-known $O(d^2/\varepsilon)$ rate. Our techniques integrate sub-exponential noise analysis, quantum sampling estimation, and classical–quantum co-design. The resulting algorithms achieve the current best dimension-dependent convergence rates for both stochastic and quantum convex optimization.

📝 Abstract
We consider the problem of minimizing a $d$-dimensional Lipschitz convex function using a stochastic gradient oracle. We introduce and motivate a setting where the noise of the stochastic gradient is isotropic in that it is bounded in every direction with high probability. We then develop an algorithm for this setting which improves upon prior results by a factor of $d$ in certain regimes, and as a corollary, achieves a new state-of-the-art complexity for sub-exponential noise. We give matching lower bounds (up to polylogarithmic factors) for both results. Additionally, we develop an efficient quantum isotropifier, a quantum algorithm which converts a variance-bounded quantum sampling oracle into one that outputs an unbiased estimate with isotropic error. Combining our results, we obtain improved dimension-dependent rates for quantum stochastic convex optimization.
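The abstract describes the noise as "bounded in every direction with high probability." A minimal formal sketch of such a condition, in illustrative notation not taken from the paper (the symbols $g$, $\sigma$, $\delta$ are assumptions here), is:

```latex
% Sketch of an isotropic noise condition (illustrative notation):
% the stochastic gradient g(x) is unbiased, and its error is bounded
% in every direction simultaneously with high probability.
\mathbb{E}\,[g(x)] = \nabla f(x),
\qquad
\Pr\!\left[\, \sup_{\|u\|_2 = 1}
  \bigl|\langle g(x) - \nabla f(x),\, u \rangle\bigr| \le \sigma \,\right]
\ge 1 - \delta .
```

By contrast, a standard variance-bounded oracle only controls $\mathbb{E}\,\|g(x)-\nabla f(x)\|_2^2$, which permits the error to concentrate in a single direction; the isotropic condition rules this out uniformly over all directions.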
Problem

Research questions and friction points this paper is trying to address.

Minimizing Lipschitz convex functions using stochastic gradient oracles
Developing algorithms for isotropic noise in stochastic optimization
Improving quantum convex optimization with isotropic error bounds
Innovation

Methods, ideas, or system contributions that make the work stand out.

Isotropic noise setting for stochastic gradient optimization
Improved algorithm achieving a $d$-factor speedup in certain regimes
Efficient quantum isotropifier enabling isotropic error estimates