🤖 AI Summary
This work addresses the challenge of generative modeling under non-log-concave target distributions. Methodologically, it formulates sampling as an optimal transport problem via Brenier maps and solves it iteratively in the space of Brenier maps using mirror gradient descent; crucially, it brings no-regret online optimization theory to this parabolic Monge–Ampère PDE system for the first time, establishing an adaptive evolution variational inequality that guarantees convergence. Theoretically, convergence to the optimal Brenier map is proven under multiple step-size schedules; empirically, the framework achieves high-fidelity sampling. Key contributions: (i) the first integration of parabolic Monge–Ampère PDEs with no-regret analysis; (ii) a unification of GAN-style adversarial learning and diffusion-model score matching within a single geometric PDE framework; and (iii) a novel geometric dynamical-systems perspective on both generative modeling and variational inference.
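For orientation, the objects the summary refers to can be written in standard optimal-transport notation (this is textbook background, not the paper's own derivation; the paper's exact discretization may differ):

```latex
% Brenier's theorem: under quadratic cost, the optimal transport map pushing
% a source density \rho_\mu onto a target density \rho_\nu is the gradient
% of a convex potential u,
T = \nabla u, \qquad (\nabla u)_{\#}\,\mu = \nu .

% The change-of-variables formula then yields the (static) Monge--Ampère
% equation that the Brenier potential u must satisfy:
\det D^2 u(x) \;=\; \frac{\rho_\mu(x)}{\rho_\nu(\nabla u(x))}.

% A parabolic relaxation turns this constraint into a flow on u; one common
% form (assumed here for illustration) is
\partial_t u \;=\; \log \det D^2 u \;-\; \log \frac{\rho_\mu}{\rho_\nu \circ \nabla u},
% whose stationary points recover the static equation above.
```

The "iterative refinement in the space of Brenier maps" then amounts to discretizing such a flow in time, with the evolution variational inequality controlling how each step decreases transportation cost.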
📝 Abstract
We introduce a novel generative modeling framework based on a discretized parabolic Monge–Ampère PDE, which emerges as a continuous limit of the Sinkhorn algorithm commonly used in optimal transport. Our method performs iterative refinement in the space of Brenier maps using a mirror gradient descent step. We establish theoretical guarantees for generative modeling through the lens of no-regret analysis, demonstrating that the iterates converge to the optimal Brenier map under a variety of step-size schedules. As a technical contribution, we derive a new Evolution Variational Inequality tailored to the parabolic Monge–Ampère PDE, connecting geometry, transportation cost, and regret. Our framework accommodates non-log-concave target distributions, constructs an optimal sampling process via the Brenier map, and integrates favorable learning techniques from generative adversarial networks and score-based diffusion models. As direct applications, we illustrate how our theory paves new pathways for generative modeling and variational inference.
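For readers unfamiliar with the discrete algorithm whose continuous limit the abstract invokes, here is a minimal sketch of entropic-regularized Sinkhorn iterations between two finite distributions (illustrative only; the function name and parameters are ours, not the paper's):

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.5, n_iters=500):
    """Entropic OT via Sinkhorn: alternately rescale rows and columns
    of the Gibbs kernel K = exp(-C/eps) to match the marginals mu, nu."""
    K = np.exp(-C / eps)          # Gibbs kernel from the cost matrix
    u = np.ones_like(mu)
    for _ in range(n_iters):
        v = nu / (K.T @ u)        # fit column marginals
        u = mu / (K @ v)          # fit row marginals
    # Entropic transport plan; rows sum to mu, columns (approx.) to nu.
    return u[:, None] * K * v[None, :]
```

Each Sinkhorn half-step updates one dual potential while holding the other fixed; sending the regularization `eps` and the step size to zero jointly is what produces a continuous-time PDE limit of parabolic Monge–Ampère type.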