🤖 AI Summary
This work addresses a long-standing bottleneck in the second-order oracle complexity of convex–concave minimax optimization. We present the first systematic extension of optimal second-order convex optimization techniques to the minimax setting, introducing a second-order Catalyst acceleration framework that integrates Newton-type updates, lazy Hessian approximations, and acceleration of arbitrary globally convergent solvers. The framework breaks the conjectured tight bound of $\mathcal{O}(\epsilon^{-2/3})$ second-order oracle calls: our analysis establishes that the proposed algorithm needs only $\tilde{\mathcal{O}}(\epsilon^{-4/7})$ second-order oracle evaluations, strictly improving on the prior state of the art. The method offers a more efficient higher-order optimization paradigm for smooth, structured minimax problems, advancing both the theoretical understanding and the practical solvability of this fundamental class of convex–concave problems.
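For orientation, here is the generic Catalyst template of Lin et al. adapted to the minimax setting (a sketch of the idea, not necessarily the paper's exact construction): the framework replaces the original problem with a sequence of proximally regularized subproblems anchored at the current iterate $(\bar{x}, \bar{y})$,

$$\min_{x \in \mathcal{X}} \max_{y \in \mathcal{Y}} \; f(x,y) + \frac{\kappa}{2}\lVert x - \bar{x}\rVert^2 - \frac{\kappa}{2}\lVert y - \bar{y}\rVert^2.$$

Each subproblem is $\kappa$-strongly convex in $x$ and $\kappa$-strongly concave in $y$, so an inner solver (here, a Newton-type method) can handle it efficiently, and the choice of the regularization parameter $\kappa$ governs the overall second-order oracle complexity.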
📝 Abstract
Previous algorithms can solve convex–concave minimax problems $\min_{x \in \mathcal{X}} \max_{y \in \mathcal{Y}} f(x,y)$ with $\mathcal{O}(\epsilon^{-2/3})$ second-order oracle calls using Newton-type methods. This result has been speculated to be optimal because the upper bound is achieved by a natural generalization of the optimal first-order method. In this work, we show an improved upper bound of $\tilde{\mathcal{O}}(\epsilon^{-4/7})$ by generalizing the optimal second-order method for convex optimization to the convex–concave minimax setting. We further apply a similar technique to lazy Hessian algorithms and show that the proposed algorithm can also be viewed as a second-order "Catalyst" framework (Lin et al., JMLR 2018) that can accelerate any globally convergent algorithm for solving minimax problems.
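To make the Catalyst-style outer loop concrete, below is a minimal, self-contained Python sketch. It is an illustration under stated assumptions, not the paper's algorithm or API: for brevity the inner solver is plain gradient descent-ascent on the regularized subproblem (the paper instead uses Newton-type inner steps), and all names and parameter values (`kappa`, `outer_iters`, `inner_iters`, `lr`) are hypothetical.

```python
import numpy as np

def catalyst_minimax(grad_x, grad_y, z0, kappa=1.0, outer_iters=100,
                     inner_iters=200, lr=0.05):
    """Catalyst-style outer loop for min_x max_y f(x, y) (illustration only).

    Each outer step inexactly solves the regularized subproblem
        f(x, y) + (kappa/2)||x - xk||^2 - (kappa/2)||y - yk||^2,
    which is strongly convex in x and strongly concave in y, so even plain
    gradient descent-ascent (used here for brevity) converges linearly on it;
    the paper's framework plugs in Newton-type inner solvers instead.
    """
    x, y = (np.asarray(z, dtype=float).copy() for z in z0)
    for _ in range(outer_iters):
        xk, yk = x.copy(), y.copy()  # anchor of the proximal subproblem
        for _ in range(inner_iters):
            # Gradient field of the regularized subproblem at (x, y).
            gx = grad_x(x, y) + kappa * (x - xk)
            gy = grad_y(x, y) - kappa * (y - yk)
            x, y = x - lr * gx, y + lr * gy  # descent in x, ascent in y
    return x, y

# Toy bilinear saddle point f(x, y) = x @ A @ y with unique saddle at 0,
# where plain gradient descent-ascent on f itself would spiral outward.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x, y = catalyst_minimax(lambda x, y: A @ y,    # gradient of f in x
                        lambda x, y: A.T @ x,  # gradient of f in y
                        z0=(np.ones(3), np.ones(3)))
print(np.linalg.norm(x), np.linalg.norm(y))  # both shrink toward 0
```

The bilinear example is the standard stress test here: naive gradient descent-ascent diverges on it, while the proximally regularized outer loop converges, which is precisely the stabilizing effect the Catalyst wrapper provides.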