Adaptive Conditional Gradient Descent

📅 2025-10-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of step-size selection in first-order optimization over non-Euclidean spaces, this paper proposes an adaptive step-size strategy for methods based on a linear minimization oracle. Methodologically, it introduces a heuristic for estimating the local Lipschitz constant of the gradient, unifying conditional gradient methods and non-Euclidean normalized steepest descent for non-convex, quasar-convex, and strongly convex objectives. Theoretically, it provides a rigorous convergence analysis, achieving state-of-the-art rates, namely $O(1/k)$ for general non-convex or quasar-convex functions and $O(e^{-k})$ for strongly convex ones. Experimentally, the algorithm demonstrates improved stability and efficiency across diverse non-Euclidean optimization tasks, consistently outperforming both adaptive and fixed-step baselines.

📝 Abstract
Selecting an effective step-size is a fundamental challenge in first-order optimization, especially for problems with non-Euclidean geometries. This paper presents a novel adaptive step-size strategy for optimization algorithms that rely on linear minimization oracles, as used in the Conditional Gradient or non-Euclidean Normalized Steepest Descent algorithms. Using a simple heuristic to estimate a local Lipschitz constant for the gradient, we can determine step-sizes that guarantee sufficient decrease at each iteration. More precisely, we establish convergence guarantees for our proposed Adaptive Conditional Gradient Descent algorithm, which covers as special cases both the classical Conditional Gradient algorithm and non-Euclidean Normalized Steepest Descent algorithms with adaptive step-sizes. Our analysis covers optimization of continuously differentiable functions in non-convex, quasar-convex, and strongly convex settings, achieving convergence rates that match state-of-the-art theoretical bounds. Comprehensive numerical experiments validate our theoretical findings and illustrate the practical effectiveness of Adaptive Conditional Gradient Descent. The results exhibit competitive performance, underscoring the potential of the adaptive step-size for applications.
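The abstract's recipe (a linear minimization oracle plus a local Lipschitz estimate that determines a sufficient-decrease step-size) can be sketched in a few lines. This is a minimal illustration, not the paper's exact rule: it assumes an $\ell_1$-ball feasible set, a secant-style Lipschitz estimate $L_k \approx \|\nabla f(x_k) - \nabla f(x_{k-1})\| / \|x_k - x_{k-1}\|$, and the standard step minimizing the resulting quadratic upper bound; the function names (`lmo_l1`, `adaptive_cgd`) are invented for this sketch.

```python
import numpy as np

def lmo_l1(grad, radius=1.0):
    """Linear minimization oracle over the l1 ball:
    argmin_{||s||_1 <= radius} <grad, s> (a signed vertex)."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def adaptive_cgd(f, grad_f, x0, lmo, iters=200):
    """Conditional gradient with a step-size driven by a local
    Lipschitz estimate (illustrative, not the paper's exact rule)."""
    x = x0.copy()
    x_prev, g_prev = x.copy(), grad_f(x)
    L = 1.0  # initial Lipschitz guess
    for _ in range(iters):
        g = grad_f(x)
        # secant-style local Lipschitz estimate
        dx = np.linalg.norm(x - x_prev)
        if dx > 1e-12:
            L = max(np.linalg.norm(g - g_prev) / dx, 1e-12)
        s = lmo(g)
        d = s - x
        gap = -g @ d  # Frank-Wolfe gap <g, x - s> >= 0
        if gap <= 1e-10:
            break  # (near-)stationary on the feasible set
        # step minimizing the local quadratic upper bound, clipped to [0, 1]
        gamma = min(gap / (L * (d @ d)), 1.0)
        x_prev, g_prev = x.copy(), g
        x = x + gamma * d
    return x

# usage: least squares constrained to the l1 ball
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = A @ (0.1 * rng.standard_normal(10))
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
x_star = adaptive_cgd(f, grad_f, np.zeros(10), lmo_l1)
```

The clipped step $\gamma_k = \min\{g_k / (L_k \|d_k\|^2),\, 1\}$ is what "guarantees sufficient decrease at each iteration" when the Lipschitz estimate is accurate; the paper's actual strategy and its non-Euclidean variants will differ in the details of the estimate and the norm used.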
Problem

Research questions and friction points this paper is trying to address.

Selecting effective step-sizes for first-order optimization over non-Euclidean geometries
Convergence guarantees for Conditional Gradient algorithms with adaptive step-sizes
Optimization of continuously differentiable functions in non-convex, quasar-convex, and strongly convex settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive step-size strategy based on a heuristic estimate of the local Lipschitz constant of the gradient
Convergence guarantees in non-convex, quasar-convex, and strongly convex settings
Unification of Conditional Gradient and non-Euclidean Normalized Steepest Descent via linear minimization oracles