High Probability Complexity Bounds of Trust-Region Stochastic Sequential Quadratic Programming with Heavy-Tailed Noise

📅 2025-03-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper studies nonlinear optimization with a stochastic objective and deterministic equality constraints, aiming to compute first- and second-order ε-stationary points using zeroth-, first-, and second-order probabilistic oracles, where the zeroth-order (objective-value) oracle is corrupted by irreducible, heavy-tailed noise. To this end, the authors propose the Trust-Region Stochastic Sequential Quadratic Programming (TR-SSQP) algorithm and establish high-probability iteration complexity bounds of O(ε⁻²) for first-order and O(ε⁻³) for second-order ε-stationarity, provided ε is bounded below by a constant determined by the irreducible noise level. Unlike existing SSQP complexity studies, which assume light-tailed (sub-exponential) zeroth-order noise and focus mostly on first-order stationarity, the analysis accommodates heavy-tailed noise and extends to second-order stationarity. Empirical validation on the CUTEst benchmark suite confirms both the efficacy and robustness of TR-SSQP.

📝 Abstract
In this paper, we consider nonlinear optimization problems with a stochastic objective and deterministic equality constraints. We propose a Trust-Region Stochastic Sequential Quadratic Programming (TR-SSQP) method and establish its high-probability iteration complexity bounds for identifying first- and second-order $\epsilon$-stationary points. In our algorithm, we assume that exact objective values, gradients, and Hessians are not directly accessible but can be estimated via zeroth-, first-, and second-order probabilistic oracles. Compared to existing complexity studies of SSQP methods that rely on a zeroth-order oracle with sub-exponential tail noise (i.e., light-tailed) and focus mostly on first-order stationarity, our analysis accommodates irreducible and heavy-tailed noise in the zeroth-order oracle and significantly extends the analysis to second-order stationarity. We show that under weaker noise conditions, our method achieves the same high-probability first-order iteration complexity bounds, while also exhibiting promising second-order iteration complexity bounds. Specifically, the method identifies a first-order $\epsilon$-stationary point in $\mathcal{O}(\epsilon^{-2})$ iterations and a second-order $\epsilon$-stationary point in $\mathcal{O}(\epsilon^{-3})$ iterations with high probability, provided that $\epsilon$ is lower bounded by a constant determined by the irreducible noise level in estimation. We validate our theoretical findings and evaluate the practical performance of our method on the CUTEst benchmark test set.
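For context, one standard notion of stationarity for equality-constrained problems $\min_x f(x)$ subject to $c(x) = 0$ is given below; the paper's precise definitions and tolerance scalings may differ.

A point $x$ is first-order $\epsilon$-stationary if, for some multiplier $\lambda$,
$$\|\nabla f(x) + \nabla c(x)^T \lambda\| \le \epsilon \quad \text{and} \quad \|c(x)\| \le \epsilon,$$
and second-order $\epsilon$-stationary if, in addition, the Lagrangian Hessian is nearly positive semidefinite on the tangent space of the constraints:
$$d^T \nabla_{xx}^2 \mathcal{L}(x, \lambda)\, d \ge -\epsilon \|d\|^2 \quad \text{for all } d \text{ with } \nabla c(x)\, d = 0.$$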
Problem

Research questions and friction points this paper is trying to address.

Solves nonlinear optimization problems with a stochastic objective and deterministic equality constraints
Handles irreducible, heavy-tailed noise in the zeroth-order (objective-value) oracle
Establishes high-probability iteration complexity bounds for first- and second-order ε-stationarity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Trust-Region Stochastic Sequential Quadratic Programming (TR-SSQP) method
Handles heavy-tailed noise in the zeroth-order oracle
Achieves high-probability first- and second-order complexity bounds of O(ε⁻²) and O(ε⁻³)
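For readers unfamiliar with the method's building blocks, the sketch below illustrates a generic trust-region SQP iteration (a normal step toward feasibility, a tangential Cauchy step toward optimality, and a merit-function ratio test) with a noisy first-order oracle on a toy problem. It is a minimal illustration of the classical template under stated assumptions, not the paper's TR-SSQP algorithm: the test problem, merit parameter, step caps, and noise model are all assumptions, and the paper's oracles, acceptance test, and parameter rules differ.

```python
import numpy as np

# Hypothetical toy problem: min f(x) = x1^2 + x2^2  s.t.  c(x) = x1 + x2 - 1 = 0
# (solution x* = (0.5, 0.5))

def f(x):
    return x[0] ** 2 + x[1] ** 2

def grad_f(x):
    return 2.0 * x

def c(x):
    return np.array([x[0] + x[1] - 1.0])

def jac_c(x):
    return np.array([[1.0, 1.0]])

def tr_ssqp_sketch(x, radius=1.0, mu=10.0, noise_std=1e-3, iters=200, seed=0):
    """Simplified trust-region SQP loop with a noisy first-order oracle."""
    rng = np.random.default_rng(seed)
    H = 2.0 * np.eye(len(x))  # exact Hessian of f for this toy problem
    for _ in range(iters):
        # Stochastic gradient estimate (stand-in for a first-order oracle)
        g = grad_f(x) + rng.normal(0.0, noise_std, size=x.shape)
        J, cv = jac_c(x), c(x)
        JJt = J @ J.T
        # Normal step: min-norm solution of J v = -c(x), capped at 80% of the radius
        v = -J.T @ np.linalg.solve(JJt, cv)
        nv = np.linalg.norm(v)
        if nv > 0.8 * radius:
            v *= 0.8 * radius / nv
        # Tangential step: Cauchy (projected-gradient) step in the null space of J
        P = np.eye(len(x)) - J.T @ np.linalg.solve(JJt, J)
        gt = P @ (g + H @ v)  # model gradient at x + v, projected onto null(J)
        ngt = np.linalg.norm(gt)
        if ngt > 1e-12:
            rem = np.sqrt(max(radius ** 2 - np.linalg.norm(v) ** 2, 0.0))
            alpha = min(ngt ** 2 / (gt @ H @ gt), rem / ngt)
            t = -alpha * gt
        else:
            t = np.zeros_like(x)
        d = v + t
        # Ratio test on an l1 merit function (exact f used here for simplicity;
        # the paper instead works with a noisy zeroth-order oracle)
        merit = lambda z: f(z) + mu * np.abs(c(z)).sum()
        pred = -(g @ d + 0.5 * d @ H @ d) + mu * (np.abs(cv).sum() - np.abs(cv + J @ d).sum())
        ared = merit(x) - merit(x + d)
        if pred > 0 and ared >= 0.1 * pred:
            x = x + d                          # accept step, expand trust region
            radius = min(2.0 * radius, 10.0)
        else:
            radius *= 0.5                      # reject step, shrink trust region
    return x
```

Because the constraint is linear and the normal step solves it exactly, iterates stay (nearly) feasible while the tangential step drives the projected gradient toward zero, up to a floor set by the gradient noise.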