Faster Rates For Federated Variational Inequalities

📅 2026-02-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the gap between the convergence rates known for stochastic variational inequalities (VIs) in federated learning and the faster rates established for federated convex optimization. The authors first show, through a refined analysis, that the classical Local Extra SGD algorithm admits tighter guarantees for general smooth and monotone VIs. They then identify an inherent limitation of Local Extra SGD that can cause excessive client drift, and propose a new algorithm, LIPPAX, which combines a local inexact proximal point method with an extra-step technique to mitigate this drift. LIPPAX achieves improved convergence guarantees in several structured regimes, including bounded Hessian, bounded operator, and low-variance settings. Furthermore, LIPPAX naturally extends to composite variational inequality problems, broadening its applicability within federated learning frameworks.
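The "extra-step" technique mentioned above is the extragradient idea: evaluate the operator at a look-ahead point before committing to the update. A minimal single-machine sketch on a toy monotone VI (this is not the paper's federated LIPPAX; the operator, step size, and iteration count are illustrative):

```python
import numpy as np

# Toy monotone VI: the bilinear saddle point min_x max_y x*y has
# operator F(x, y) = (y, -x), which is monotone but not strongly so.
def F(z):
    x, y = z
    return np.array([y, -x])

def extragradient(z0, step=0.3, iters=200):
    """Extra-step (extragradient) iteration: take a look-ahead step
    first, then update using the operator at the look-ahead point."""
    z = np.array(z0, dtype=float)
    for _ in range(iters):
        z_half = z - step * F(z)      # look-ahead (extra) step
        z = z - step * F(z_half)      # update with look-ahead operator
    return z

z_star = extragradient([1.0, 1.0])
```

On this example, plain gradient descent-ascent (dropping the look-ahead step) spirals away from the solution (0, 0), while the extra step contracts toward it; this is the classical motivation for extragradient-type methods on monotone VIs.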

📝 Abstract
In this paper, we study federated optimization for solving stochastic variational inequalities (VIs), a problem that has attracted growing attention in recent years. Despite substantial progress, a significant gap remains between existing convergence rates and the state-of-the-art bounds known for federated convex optimization. In this work, we address this limitation by establishing a series of improved convergence rates. First, we show that, for general smooth and monotone variational inequalities, the classical Local Extra SGD algorithm admits tighter guarantees under a refined analysis. Next, we identify an inherent limitation of Local Extra SGD, which can lead to excessive client drift. Motivated by this observation, we propose a new algorithm, the Local Inexact Proximal Point Algorithm with Extra Step (LIPPAX), and show that it mitigates client drift and achieves improved guarantees in several regimes, including bounded Hessian, bounded operator, and low-variance settings. Finally, we extend our results to federated composite variational inequalities and establish improved convergence guarantees.
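To make the local inexact proximal point idea concrete, here is a heavily hedged sketch of one possible federated round: each client approximately solves a regularized subproblem around the current server point, and the server averages the results. The operators, regularization parameter, inner solver, and step sizes below are illustrative assumptions, not the paper's exact LIPPAX method:

```python
import numpy as np

def make_client_op(seed):
    """Random strongly monotone linear client operator F_i(z) = A_i z:
    a skew-symmetric part plus 0.5*I (sizes and seeds are illustrative)."""
    rng = np.random.default_rng(seed)
    B = rng.normal(size=(2, 2))
    A = B - B.T + 0.5 * np.eye(2)
    return lambda z: A @ z

clients = [make_client_op(s) for s in range(4)]

def inexact_prox(F, z_bar, gamma=0.5, inner_steps=10, lr=0.1):
    """Approximately solve the proximal subproblem
    F(z) + (z - z_bar)/gamma = 0 with a few fixed-point steps;
    stopping after inner_steps is the 'inexact' part."""
    z = z_bar.copy()
    for _ in range(inner_steps):
        z = z - lr * (F(z) + (z - z_bar) / gamma)
    return z

z = np.ones(2)
for _ in range(30):                              # communication rounds
    local_iterates = [inexact_prox(F, z) for F in clients]
    z = np.mean(local_iterates, axis=0)          # server averaging
```

The proximal regularizer keeps each client's local iterate anchored to the last global point, which is one intuition for why proximal-style local updates reduce client drift relative to running many unregularized local steps.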
Problem

Research questions and friction points this paper is trying to address.

federated optimization
stochastic variational inequalities
convergence rates
client drift
federated learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Variational Inequalities
Client Drift
Local Extra SGD
LIPPAX
Convergence Rates