🤖 AI Summary
This work addresses hierarchical variational inequality problems involving smooth operators with a two-level structure and finite-sum form, a class that encompasses function minimization, saddle-point problems, and Nash equilibrium computation. The paper proposes the first variance-reduced stochastic extragradient algorithm tailored to this setting, treating the bilevel structure in a unified way under both Euclidean and Bregman distance frameworks. The analysis establishes the first convergence rates and sample complexity bounds for this class of problems, extending the theoretical foundations of variational inequality solvers and yielding an efficient algorithmic framework with rigorous guarantees for a wide range of optimization tasks.
📝 Abstract
We are concerned with optimization in a broad sense through the lens of solving variational inequalities (VIs) -- a class of problems so general that it covers, as particular cases, minimization of functions, saddle-point (minimax) problems, Nash equilibrium problems, and many others. The key challenges in our problem formulation are the two-level hierarchical structure and the finite-sum representation of the smooth operators at each level. For this setting, we are the first to prove convergence rates and complexity statements for variance-reduced stochastic algorithms that approach the solution of hierarchical VIs in the Euclidean and Bregman setups.
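To make the connection between VIs and saddle-point problems concrete, here is a minimal toy sketch (not the paper's variance-reduced, hierarchical method) of the basic extragradient step on the scalar saddle-point problem min_x max_y xy, whose VI operator is F(x, y) = (y, -x). The function names and parameters below are our own illustrative choices. Plain gradient descent-ascent cycles or diverges on this problem; extragradient's "look-ahead" evaluation restores convergence.

```python
import numpy as np

def F(z):
    """VI operator of min_x max_y x*y: F(x, y) = (grad_x, -grad_y)."""
    x, y = z
    return np.array([y, -x])

def extragradient(z0, step=0.1, iters=1000):
    """Deterministic extragradient: extrapolate, then update."""
    z = np.asarray(z0, dtype=float)
    for _ in range(iters):
        z_half = z - step * F(z)   # extrapolation (look-ahead) point
        z = z - step * F(z_half)   # update using the look-ahead operator value
    return z

z = extragradient([1.0, 1.0])
print(np.linalg.norm(z))  # shrinks toward 0, the unique saddle point (0, 0)
```

The variance-reduced, two-level algorithms the paper analyzes build on this extrapolate-then-update template, replacing the exact operator with cheap stochastic finite-sum estimates.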