Primal Methods for Variational Inequality Problems with Functional Constraints

📅 2024-03-19
🏛️ Mathematical Programming
📈 Citations: 2
Influential: 0
🤖 AI Summary
This paper studies variational inequality (VI) problems with multiple functional constraints. Conventional first-order methods rely on expensive projection or linear minimization oracles, while existing primal-dual algorithms require prior knowledge of the optimal Lagrange multipliers. To address these limitations, the authors propose a purely primal Constrained Gradient Method (CGM), the first algorithm to achieve non-asymptotic convergence without any prior information about the Lagrange multipliers. Under monotonicity and strong monotonicity assumptions, CGM matches the operator query complexity of projection-based methods while solving only a lightweight quadratic programming subproblem per iteration, substantially reducing per-iteration cost. Numerical experiments indicate that CGM is efficient and robust on multi-constrained VI problems and consistently outperforms state-of-the-art primal-dual methods.

📝 Abstract
Variational inequality problems are recognized for their broad applications across various fields, including machine learning and operations research. First-order methods have emerged as the standard approach for solving these problems due to their simplicity and scalability. However, they typically rely on projection or linear minimization oracles to navigate the feasible set, which becomes computationally expensive in practical scenarios featuring multiple functional constraints. Existing efforts to tackle such functional constrained variational inequality problems have centered on primal-dual algorithms grounded in the Lagrangian function. These algorithms, along with their theoretical analysis, often require the existence and prior knowledge of the optimal Lagrange multipliers. In this work, we propose a simple primal method, termed Constrained Gradient Method (CGM), for addressing functional constrained variational inequality problems, without requiring any information on the optimal Lagrange multipliers. We establish a non-asymptotic convergence analysis of the algorithm for Minty variational inequality problems with monotone operators under smooth constraints. Remarkably, our algorithms match the complexity of projection-based methods in terms of operator queries for both monotone and strongly monotone settings, while using significantly cheaper oracles based on quadratic programming. Furthermore, we provide several numerical examples to evaluate the efficacy of our algorithms.
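
For concreteness, the problem class described in the abstract can be stated as follows; the notation (operator F, constraint functions g_1, ..., g_m) is standard but assumed here rather than taken from the paper:

\[
\text{find } x^\star \in \mathcal{X} := \{ x \in \mathbb{R}^n : g_i(x) \le 0,\ i = 1, \dots, m \}
\quad \text{such that} \quad
\langle F(x),\, x - x^\star \rangle \ge 0 \quad \text{for all } x \in \mathcal{X},
\]

which is the Minty (weak) formulation referenced in the abstract. For a continuous monotone operator F, it coincides with the more common Stampacchia formulation \(\langle F(x^\star),\, x - x^\star \rangle \ge 0\) for all \(x \in \mathcal{X}\).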
Problem

Research questions and friction points this paper is trying to address.

Projection and linear minimization oracles become computationally expensive when the feasible set is defined by multiple functional constraints
Existing primal-dual algorithms require prior knowledge of the optimal Lagrange multipliers
No purely primal method with non-asymptotic convergence guarantees was previously available for this setting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proposes the Constrained Gradient Method (CGM), a purely primal algorithm
Requires no knowledge of the optimal Lagrange multipliers
Replaces projections with cheaper quadratic programming subproblems while matching the operator query complexity of projection-based methods (a hedged sketch follows this list)
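
The paper's exact subproblem is not reproduced on this page, so the following is only a minimal sketch of what a purely primal, QP-based iteration can look like: take a gradient-style step along -F(x), then solve a small QP that projects onto the linearized constraints. The step rule and all names (cgm_step, eta, the toy operator and constraint) are illustrative assumptions, not the authors' actual algorithm.

# Hedged sketch: a primal iteration in the spirit of a constrained gradient
# method. Each step solves a small QP over the *linearized* functional
# constraints instead of projecting onto the true feasible set.
import numpy as np
import cvxpy as cp

def cgm_step(x, F, gs, grad_gs, eta):
    """One illustrative step (assumed form, not the paper's):
    min_y ||y - (x - eta*F(x))||^2
    s.t. g_i(x) + <grad g_i(x), y - x> <= 0 for each constraint i."""
    y = cp.Variable(x.shape[0])
    target = x - eta * F(x)  # operator (gradient-style) step
    constraints = [g(x) + gg(x) @ (y - x) <= 0 for g, gg in zip(gs, grad_gs)]
    cp.Problem(cp.Minimize(cp.sum_squares(y - target)), constraints).solve()
    return y.value

# Toy strongly monotone instance: F is the gradient of 0.5*||x - b||^2 with
# b outside the unit ball, so the VI solution is the projection of b onto
# the ball, i.e. x* = (1, 0).
b = np.array([2.0, 0.0])
F = lambda x: x - b
gs = [lambda x: x @ x - 1.0]   # g(x) = ||x||^2 - 1 <= 0  (unit ball)
grad_gs = [lambda x: 2.0 * x]

x = np.zeros(2)
for _ in range(100):
    x = cgm_step(x, F, gs, grad_gs, eta=0.2)
print(x)  # ~ [1.0, 0.0]

The point of the sketch is the per-iteration cost: the subproblem is a QP with m linear constraints, which is typically far cheaper than an exact projection onto {x : g_i(x) <= 0} when the g_i are nonlinear.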