Inexact subgradient methods for semialgebraic functions

📅 2024-04-30
📈 Citations: 2
Influential: 1
🤖 AI Summary
This paper investigates the convergence of the subgradient method for nonconvex semialgebraic optimization under persistent additive errors, where each subgradient evaluation is corrupted by an error of magnitude at most ε. For the first time in the nonconvex semialgebraic setting, it establishes an explicit geometric dependence between the error magnitude and the resulting convergence accuracy: iterates ultimately oscillate within an O(ε^ρ)-neighborhood of the critical set, where ρ reflects geometric characteristics of the underlying problem. Methodologically, the analysis integrates nonsmooth analysis, semialgebraic geometry, a Lyapunov-type descent lemma, and an invariance principle under vanishing step sizes, yielding a unified framework applicable to both constant and diminishing step sizes. Key contributions include: (1) an O(ε^ρ) approximation guarantee for the limiting distance to the critical set; (2) improved complexity bounds, O(1/√k) and O(1/k), for averaged iterates of convex semialgebraic objectives; and (3) a novel descent lemma and a characterization of asymptotic sequence behavior applicable to general nonconvex functions.
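The perturbed iteration described above can be illustrated with a minimal sketch (not the paper's exact algorithm or assumptions): inexact subgradient descent on the one-dimensional semialgebraic function f(x) = |x|, whose subgradient is sign(x), with each evaluation corrupted by an additive error of magnitude at most eps. The function names and parameters below are illustrative only.

```python
import random

def inexact_subgradient(x0, step, eps, iters, seed=0):
    """Constant-step subgradient method for f(x) = |x| with
    additive subgradient errors bounded by eps (a toy model of
    the persistent-error setting discussed above)."""
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        g = (x > 0) - (x < 0)        # subgradient of |x|: sign(x)
        e = rng.uniform(-eps, eps)   # additive error, |e| <= eps
        x -= step * (g + e)          # inexact subgradient step
    return x

x_final = inexact_subgradient(x0=5.0, step=0.05, eps=0.1, iters=1000)
# After a transient, iterates fluctuate near the critical point 0;
# the fluctuation radius is controlled by the step size and eps.
print(abs(x_final))
```

With a constant step, the iterates do not converge to 0 but oscillate in a small neighborhood of it, mirroring the paper's observation that constant steps enlarge the fluctuation region while keeping it of controlled size.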

📝 Abstract
Motivated by the extensive application of approximate gradients in machine learning and optimization, we investigate inexact subgradient methods subject to persistent additive errors. Within a nonconvex semialgebraic framework, assuming boundedness or coercivity, we establish that the method yields iterates that eventually fluctuate near the critical set at a proximity characterized by an $O(\epsilon^\rho)$ distance, where $\epsilon$ denotes the magnitude of subgradient evaluation errors, and $\rho$ encapsulates geometric characteristics of the underlying problem. Our analysis comprehensively addresses both vanishing and constant step-size regimes. Notably, the latter regime inherently enlarges the fluctuation region, yet this enlargement remains on the order of $\epsilon^\rho$. In the convex scenario, employing a universal error bound applicable to coercive semialgebraic functions, we derive novel complexity results concerning averaged iterates. Additionally, our study produces auxiliary results of independent interest, including descent-type lemmas for nonsmooth nonconvex functions and an invariance principle governing the behavior of algorithmic sequences under small-step limits.
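In standard notation (a schematic rendering, not quoted from the paper: $x_k$ are the iterates, $\alpha_k$ the step sizes, $g_k$ the subgradients, and $e_k$ the evaluation errors), the inexact iteration and its guarantee read:

$$x_{k+1} = x_k - \alpha_k (g_k + e_k), \qquad g_k \in \partial f(x_k), \quad \|e_k\| \le \epsilon,$$

$$\limsup_{k \to \infty} \operatorname{dist}\bigl(x_k, \operatorname{crit} f\bigr) = O(\epsilon^{\rho}).$$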
Problem

Research questions and friction points this paper is trying to address.

Investigates inexact subgradient methods with persistent additive errors
Analyzes convergence near critical set under nonconvex semialgebraic framework
Derives complexity results for convex coercive semialgebraic functions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Inexact subgradient methods with persistent errors
Nonconvex semialgebraic framework with O(ε^ρ) bounds
Complexity results for convex coercive semialgebraic functions