Smooth Quasar-Convex Optimization with Constraints

📅 2025-10-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work studies first-order optimization of $\gamma$-quasar-convex smooth functions subject to general convex constraints, aiming to design accelerated algorithms with near-optimal query complexity. To this end, it proposes the Approximate Accelerated Proximal Extrapolation Point (A2PEP) framework, the first such method applicable to general convex constraints, attaining a first-order oracle complexity of $\widetilde{O}(1/(\gamma\sqrt{\varepsilon}))$, which is nearly optimal. The approach unifies acceleration principles with first-order strategies, including projected gradient descent and Frank–Wolfe methods, and provides a rigorous convergence analysis under the non-convex quasar-convex geometry. Theoretically, this work fills a long-standing gap in accelerated algorithms for constrained quasar-convex optimization, improving upon existing Riemannian acceleration bounds. Moreover, it delivers tight theoretical guarantees for practical non-convex problems such as linear dynamical system identification and generalized linear model training.
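For context, the standard definition of quasar-convexity from the literature (not restated in the summary above): a differentiable function $f$ is $\gamma$-quasar-convex for some $\gamma \in (0, 1]$ with respect to a minimizer $x^\ast$ if, for all $x$ in the domain,

```latex
f(x^\ast) \;\ge\; f(x) + \frac{1}{\gamma}\,\langle \nabla f(x),\, x^\ast - x \rangle .
```

Setting $\gamma = 1$ recovers star-convexity with respect to $x^\ast$; every convex function satisfies the condition with $\gamma = 1$ at any of its minimizers.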

📝 Abstract
Quasar-convex functions form a broad nonconvex class with applications to linear dynamical systems, generalized linear models, and Riemannian optimization, among others. Current nearly optimal algorithms work only in affine spaces due to the loss of one degree of freedom when working with general convex constraints. Obtaining an accelerated algorithm that makes nearly optimal $\widetilde{O}(1/(\gamma\sqrt{\varepsilon}))$ first-order queries to a $\gamma$-quasar-convex smooth function *with constraints* was independently asked as an open problem in Martínez-Rubio (2022); Lezane, Langer, and Koolen (2024). In this work, we solve this question by designing an inexact accelerated proximal point algorithm that we implement using a first-order method achieving the aforementioned rate and, as a consequence, we improve the complexity of the accelerated geodesically Riemannian optimization solution in Martínez-Rubio (2022). We also analyze projected gradient descent and Frank–Wolfe algorithms in this constrained quasar-convex setting. To the best of our knowledge, our work provides the first analyses of first-order methods for quasar-convex smooth functions with general convex constraints.
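The abstract names Frank–Wolfe as one of the constrained first-order methods analyzed. As a minimal sketch of the classical Frank–Wolfe (conditional gradient) iteration, here it is with the textbook step size $2/(k+2)$ and an illustrative linear minimization oracle for the unit Euclidean ball; the objective and constraint set are assumptions for illustration, not the paper's setup:

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, num_iters=200):
    # Frank-Wolfe / conditional gradient: at each step, query the linear
    # minimization oracle and move toward its answer with step 2/(k+2).
    x = x0
    for k in range(num_iters):
        s = lmo(grad(x))                 # argmin over C of <grad f(x), s>
        gamma = 2.0 / (k + 2.0)
        x = (1 - gamma) * x + gamma * s  # convex combination stays in C
    return x

def ball_lmo(g):
    # LMO for the unit Euclidean ball: the minimizer of <g, s> is -g/||g||.
    n = np.linalg.norm(g)
    return -g / n if n > 0 else g

# Toy convex (hence 1-quasar-convex) objective f(x) = ||x||^2 over the
# unit ball; its constrained minimizer is the origin.
x_fw = frank_wolfe(lambda x: 2.0 * x, ball_lmo, np.array([0.9, 0.2]))
```

The projection-free nature of the step, needing only a linear oracle over the constraint set, is what distinguishes Frank–Wolfe from projection-based methods.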
Problem

Research questions and friction points this paper is trying to address.

Develop accelerated algorithm for constrained quasar-convex optimization
Address open problem of nearly optimal first-order queries with constraints
Provide first analysis of first-order methods with convex constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Inexact accelerated proximal point algorithm implementation
First-order method achieving nearly optimal query rate
Analysis of projected gradient descent and Frank-Wolfe
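The bullets above mention projected gradient descent in the constrained setting. A minimal self-contained sketch of the generic projected gradient step, run on a toy convex (hence 1-quasar-convex) objective over the unit Euclidean ball; the objective, step size, and constraint set are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto the ball {x : ||x|| <= radius}.
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def projected_gradient_descent(grad, x0, step, num_iters=200, radius=1.0):
    # x_{k+1} = Proj_C(x_k - step * grad(x_k))
    x = x0
    for _ in range(num_iters):
        x = project_ball(x - step * grad(x), radius)
    return x

# Toy objective f(x) = ||x||^2 (convex, so 1-quasar-convex); its
# minimizer over the unit ball is the origin.
f_grad = lambda x: 2.0 * x
x_pgd = projected_gradient_descent(f_grad, np.array([3.0, -4.0]), step=0.25)
```

Each iterate is feasible by construction, which is the property the constrained analyses in the paper build on.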