On exploration of an interior mirror descent flow for stochastic nonconvex constrained problem

๐Ÿ“… 2025-07-21
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This paper studies nonsmooth nonconvex optimization over the intersection of the closure of an open feasible set and a smooth manifold. To address interior-point optimization under nonconvex constraints, we propose a subgradient flow framework induced by barrier functions that define a Riemannian metricโ€”unifying, for the first time, the dynamical essences of Hessian barrier methods and mirror descent. We characterize the geometric origins of spurious stationary points common to both methods, derive sufficient conditions to avoid them, and introduce a stochastic perturbation strategy. Leveraging differential inclusions, we construct a continuous-time dynamical system and design a novel Riemannian subgradient algorithm. We establish theoretical convergence of algorithmic trajectories to approximate stationary points, significantly broadening the applicability and convergence guarantees of existing methods for nonconvex constrained optimization. This work establishes the first unified geometric framework for nonconvex interior-point methods.

๐Ÿ“ Abstract
We study a nonsmooth nonconvex optimization problem defined over nonconvex constraints, where the feasible set is the intersection of the closure of an open set and a smooth manifold. By endowing the open set with a Riemannian metric induced by a barrier function, we obtain a Riemannian subgradient flow, formulated as a differential inclusion, that remains strictly within the interior of the feasible set. This continuous dynamical system unifies two classes of iterative optimization methods, namely the Hessian barrier method and the mirror descent scheme, by revealing that both can be interpreted as discrete approximations of the continuous flow. We explore the long-term behavior of the trajectories generated by this dynamical system and show that the known deficient convergence properties of the Hessian barrier method and the mirror descent scheme can be interpreted in a unified and more insightful way through those of the continuous trajectory. For instance, the notorious spurious stationary points (Chen et al., 2024) observed in the Hessian barrier method and the mirror descent scheme are interpreted as stable equilibria of the dynamical system that do not correspond to genuine stationary points of the original optimization problem. We provide two sufficient conditions under which these spurious stationary points can be avoided when the strict complementarity condition holds. In the absence of these regularity conditions, we propose a random perturbation strategy that ensures the trajectory converges (subsequentially) to an approximate stationary point. Building on these insights, we introduce two iterative Riemannian subgradient methods, both interior-point methods, that generalize the existing Hessian barrier method and mirror descent scheme for solving nonsmooth nonconvex optimization problems.
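The abstract's key observation is that mirror descent can be read as a discrete approximation of a barrier-induced flow that never leaves the interior of the feasible set. As a minimal illustration of that interior-preserving behavior (a sketch, not the paper's algorithm), the snippet below runs entropic mirror descent on the probability simplex: the negative-entropy barrier yields a multiplicative update that keeps every iterate strictly positive. The cost vector `c` and step size are hypothetical choices for the demo.

```python
import numpy as np

def entropic_mirror_descent_step(x, grad, step):
    """One mirror descent step on the probability simplex.

    The negative-entropy barrier induces the (Riemannian) geometry;
    the resulting multiplicative update keeps iterates strictly in
    the interior of the simplex, as the continuous flow does.
    """
    y = x * np.exp(-step * grad)   # mirror step: always positive
    return y / y.sum()             # project back onto the simplex

# Toy problem: minimize f(x) = <c, x> over the simplex.
# The minimizer puts all mass on the smallest coordinate of c.
c = np.array([3.0, 1.0, 2.0])
x = np.ones(3) / 3                 # interior starting point
for _ in range(200):
    x = entropic_mirror_descent_step(x, c, step=0.5)
# x concentrates on coordinate 1 while every entry stays > 0.
```

As the step size shrinks, iterates of this scheme trace the continuous trajectory more closely, which is the sense in which the paper treats such methods as discretizations of one flow.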
Problem

Research questions and friction points this paper is trying to address.

Explores interior mirror descent flow for stochastic nonconvex constrained problems
Unifies Hessian barrier method and mirror descent via continuous dynamical system
Addresses spurious stationary points in nonconvex optimization with new conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Riemannian subgradient flow for nonconvex constraints
Unifies Hessian barrier and mirror descent methods
Random perturbation avoids spurious stationary points
๐Ÿ”Ž Similar Papers
No similar papers found.