Non-Euclidean High-Order Smooth Convex Optimization

📅 2024-11-13
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the efficient optimization of convex functions whose $q$-th derivatives are Hölder continuous, in non-Euclidean settings under general norms, including the $\ell_p$ norms for $1 \leq p \leq \infty$, and with access to an inexact ball optimization oracle. The authors propose a non-Euclidean inexact accelerated proximal point method built on an inexact uniformly convex regularizer. A key contribution is an information-theoretic lower bound for high-dimensional convex optimization that applies to general norms and all orders $q \geq 1$; applied to the first-order smooth case, it resolves an open question in parallel convex optimization. The proposed algorithms achieve nearly optimal convergence rates, matching the lower bound in the $\ell_p$ settings even under randomized and parallel computation, yielding a framework that unifies high-order smooth convex optimization for all $q \geq 1$.

📝 Abstract
We develop algorithms for the optimization of convex objectives that have Hölder continuous $q$-th derivatives by using a $q$-th order oracle, for any $q \geq 1$. Our algorithms work for general norms under mild conditions, including the $\ell_p$-settings for $1 \leq p \leq \infty$. We can also optimize structured functions that allow for inexactly implementing a non-Euclidean ball optimization oracle. We do this by developing a non-Euclidean inexact accelerated proximal point method that makes use of an inexact uniformly convex regularizer. We show a lower bound for general norms that demonstrates our algorithms are nearly optimal in high dimensions in the black-box oracle model for $\ell_p$-settings and all $q \geq 1$, even in randomized and parallel settings. This new lower bound, when applied to the first-order smooth case, resolves an open question in parallel convex optimization.
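As a rough illustration (not the paper's actual algorithm), the core idea of a proximal point method with a uniformly convex regularizer can be sketched in plain NumPy: each outer step approximately minimizes $f(y) + \lambda \|y - x_k\|_p^{q+1}$, with the subproblem solved inexactly by gradient descent. All names, step sizes, and the inner solver here are assumptions for illustration; the paper's method additionally uses acceleration and a careful analysis of the oracle's inexactness.

```python
import numpy as np

def reg_grad(z, p, q):
    # Gradient of the uniformly convex regularizer ||z||_p^(q+1),
    # valid for p >= 2 (smooth away from the origin).
    nrm = np.linalg.norm(z, ord=p)
    if nrm == 0.0:
        return np.zeros_like(z)
    return (q + 1) * nrm ** (q + 1 - p) * np.abs(z) ** (p - 1) * np.sign(z)

def inexact_prox_step(grad_f, x, lam=1.0, p=2, q=1, inner_iters=500, lr=0.05):
    # Approximately solve  y = argmin_y f(y) + lam * ||y - x||_p^(q+1)
    # by plain gradient descent (this inexact inner solve plays the role
    # of the "inexact ball/proximal oracle" in the abstract).
    y = x.copy()
    for _ in range(inner_iters):
        y = y - lr * (grad_f(y) + lam * reg_grad(y - x, p, q))
    return y

def prox_point_method(grad_f, x0, steps=30, **kw):
    # Outer (non-accelerated) proximal point loop.
    x = x0.copy()
    for _ in range(steps):
        x = inexact_prox_step(grad_f, x, **kw)
    return x
```

For a simple quadratic $f(y) = \|y - a\|_2^2$ with $p = 2$, $q = 1$, $\lambda = 1$, each exact outer step halves the distance to the minimizer, so the iterates converge to $a$ geometrically.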
Problem

Research questions and friction points this paper is trying to address.

Optimizing convex objectives with Hölder continuous derivatives
Developing algorithms for non-Euclidean high-order smooth optimization
Resolving open questions in parallel convex optimization
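For reference, Hölder continuity of the $q$-th derivative, the smoothness assumption behind the first bullet, is typically stated as follows (the constant $L$ and exponent $\nu$ are standard notation assumed here, not taken from this page):

$$\|\nabla^q f(x) - \nabla^q f(y)\|_* \;\leq\; L\,\|x - y\|^{\nu}, \qquad \nu \in (0, 1],$$

where $\|\cdot\|$ is the (possibly non-Euclidean) norm on the domain and $\|\cdot\|_*$ the induced norm on $q$-th derivative tensors; $\nu = 1$ recovers the Lipschitz-continuous case.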
Innovation

Methods, ideas, or system contributions that make the work stand out.

Non-Euclidean optimization algorithms
Inexact accelerated proximal method
Uniformly convex regularizer usage
Juan Pablo Contreras
Institute for Mathematical and Computational Engineering, Pontificia Universidad Católica de Chile
Cristóbal Guzmán
Institute for Mathematical and Computational Engineering, Faculty of Mathematics and School of Engineering, Pontificia Universidad Católica de Chile
David Martínez-Rubio
Carlos III University
Optimization · Online Learning · Deep Learning