Accelerated Multiple Wasserstein Gradient Flows for Multi-objective Distributional Optimization

📅 2026-01-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses multi-objective optimization over probability distributions in Wasserstein space, with the goal of efficiently computing weak Pareto optimal solutions. To this end, the authors propose the Accelerated Multi-objective Wasserstein Gradient Descent (A-MWGraD) algorithm, which for the first time incorporates a Nesterov-type acceleration mechanism into this setting. By leveraging the geometric structure of Wasserstein space, geodesic convexity theory, and continuous-time dynamical analysis, A-MWGraD achieves convergence rates of $O(1/t^2)$ or even exponential decay—substantially improving upon the $O(1/t)$ rates of existing methods. Implemented via kernel-based discretization, the algorithm demonstrates markedly faster convergence and higher sampling efficiency in multi-objective sampling tasks.
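The paper itself does not spell out its update equations here, but the ingredients it names (a kernel-based particle discretization plus Nesterov-type momentum) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the two quadratic potentials `grad_V1`/`grad_V2` stand in for the paper's multiple objective functionals, the RBF-kernel update is an SVGD-style surrogate for the kernel discretization, and the objective weights `w` are held fixed, whereas A-MWGraD would adapt them to reach weakly Pareto optimal points.

```python
import numpy as np

# Hypothetical toy potentials standing in for the paper's multiple
# objective functionals (e.g. KL divergences to several targets).
def grad_V1(x):
    return x - np.array([2.0, 0.0])   # pulls toward mean (2, 0)

def grad_V2(x):
    return x - np.array([-2.0, 0.0])  # pulls toward mean (-2, 0)

def kernel_terms(X, h=0.5):
    """RBF kernel matrix and the repulsive term sum_j grad_{x_j} k(x_j, x_i)."""
    diff = X[:, None, :] - X[None, :, :]        # (n, n, d), x_i - x_j
    K = np.exp(-np.sum(diff**2, axis=-1) / (2 * h**2))
    repulse = np.sum(diff / h**2 * K[:, :, None], axis=1)  # (n, d)
    return K, repulse

def accelerated_particle_flow(X, steps=200, eta=0.05, w=(0.5, 0.5)):
    """Nesterov-style momentum on a kernelized particle update (sketch).

    The weights w are fixed here; the actual algorithm adapts them so
    the flow converges to a weakly Pareto optimal distribution.
    """
    V = np.zeros_like(X)                         # particle velocities
    for t in range(1, steps + 1):
        Y = X + (t - 1) / (t + 2) * V            # Nesterov look-ahead point
        K, repulse = kernel_terms(Y)
        drive = -(w[0] * grad_V1(Y) + w[1] * grad_V2(Y))
        phi = (K @ drive + repulse) / len(X)     # kernelized descent direction
        X_new = Y + eta * phi
        V = X_new - X
        X = X_new
    return X
```

The `(t - 1) / (t + 2)` momentum schedule is the standard choice behind the $O(1/t^2)$ rate for convex objectives in Euclidean Nesterov acceleration; the paper's analysis transfers this idea to geodesically convex functionals in Wasserstein space.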

📝 Abstract
We study multi-objective optimization over probability distributions in Wasserstein space. Recently, Nguyen et al. (2025) introduced the Multiple Wasserstein Gradient Descent (MWGraD) algorithm, which exploits the geometric structure of Wasserstein space to jointly optimize multiple objectives. Building on this approach, we propose an accelerated variant, A-MWGraD, inspired by Nesterov's acceleration. We analyze the continuous-time dynamics and establish convergence to weakly Pareto optimal points in probability space. Our theoretical results show that A-MWGraD achieves a convergence rate of $O(1/t^2)$ for geodesically convex objectives and $O(e^{-\sqrt{\beta}t})$ for $\beta$-strongly geodesically convex objectives, improving upon the $O(1/t)$ rate of MWGraD in the geodesically convex setting. We further introduce a practical kernel-based discretization for A-MWGraD and demonstrate through numerical experiments that it consistently outperforms MWGraD in convergence speed and sampling efficiency on multi-target sampling tasks.
Problem

Research questions and friction points this paper is trying to address.

multi-objective optimization
Wasserstein space
distributional optimization
Pareto optimality
gradient flows
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wasserstein gradient flow
multi-objective optimization
Nesterov acceleration
geodesic convexity
distributional optimization