Extragradient Method for $(L_0, L_1)$-Lipschitz Root-finding Problems

📅 2025-10-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing extragradient (EG) methods for minimax optimization, root-finding, and variational inequalities rely on the restrictive strong $L$-Lipschitz assumption, failing to capture complex operator structures arising in modern machine learning. Method: We propose the first adaptive step-size strategy based on dynamic operator-norm estimation, operating under the significantly milder $\alpha$-symmetric $(L_0, L_1)$-Lipschitz condition. Contribution/Results: We establish sublinear convergence for monotone operators and linear convergence for strongly monotone operators. Moreover, we provide the first local convergence guarantee under the weak Minty condition—a substantially weaker requirement than standard monotonicity. Experiments demonstrate that our method achieves robust, practical convergence across diverse non-standard operator settings, markedly broadening the applicability of extragradient methods beyond classical assumptions.

📝 Abstract
Introduced by Korpelevich in 1976, the extragradient method (EG) has become a cornerstone technique for solving min-max optimization, root-finding problems, and variational inequalities (VIs). Despite its longstanding presence and significant attention within the optimization community, most works studying its convergence guarantees assume the strong $L$-Lipschitz condition. In this work, building on the assumptions proposed by Zhang et al. [2024b] for minimization and Vankov et al. [2024] for VIs, we focus on the more relaxed $\alpha$-symmetric $(L_0, L_1)$-Lipschitz condition. This condition generalizes the standard Lipschitz assumption by allowing the Lipschitz constant to scale with the operator norm, providing a more refined characterization of problem structures in modern machine learning. Under the $\alpha$-symmetric $(L_0, L_1)$-Lipschitz condition, we propose a novel step size strategy for EG to solve root-finding problems and establish sublinear convergence rates for monotone operators and linear convergence rates for strongly monotone operators. Additionally, we prove local convergence guarantees for weak Minty operators. We supplement our analysis with experiments validating our theory and demonstrating the effectiveness and robustness of the proposed step sizes for EG.
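To make the idea concrete, here is a minimal sketch of the extragradient iteration with an operator-norm-dependent step size. The specific rule $\eta_k = 1 / (2(L_0 + L_1 \|F(x_k)\|))$ is an illustrative assumption in the spirit of the $(L_0, L_1)$-Lipschitz condition, not the paper's exact strategy; `F`, `L0`, and `L1` are placeholder names.

```python
import numpy as np

def extragradient(F, x0, L0=1.0, L1=1.0, iters=500):
    """Extragradient sketch with an adaptive, operator-norm-scaled step size.

    The step size eta_k = 1 / (2 * (L0 + L1 * ||F(x_k)||)) is a plausible
    form under an (L0, L1)-Lipschitz condition; the paper's actual rule
    may differ.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = F(x)
        eta = 1.0 / (2.0 * (L0 + L1 * np.linalg.norm(g)))
        x_half = x - eta * g           # extrapolation (look-ahead) step
        x = x - eta * F(x_half)        # update using the extrapolated point
    return x

# Toy example: bilinear (monotone) operator F(x, y) = (y, -x), root at the origin.
F = lambda z: np.array([z[1], -z[0]])
sol = extragradient(F, np.array([1.0, 1.0]))
```

On this bilinear example plain gradient descent-ascent diverges, while the extrapolation step lets EG contract toward the root; the adaptive step size simply shrinks when the operator norm is large.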
Problem

Research questions and friction points this paper is trying to address.

Extends extragradient method to relaxed Lipschitz root-finding problems
Proposes novel step size strategy for generalized Lipschitz conditions
Establishes convergence rates for monotone and strongly monotone operators
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extragradient method with novel step size strategy
Application of the relaxed $\alpha$-symmetric $(L_0, L_1)$-Lipschitz condition
Convergence guarantees for monotone and strongly monotone operators