🤖 AI Summary
This work addresses the limitations of Wasserstein gradient descent in generative modeling—namely, its reliance on prior knowledge of the target potential function or costly neural network training. We propose the Koopman Spectral Wasserstein Gradient Descent (KSWGD) framework, the first to deeply integrate Koopman operator spectral analysis with optimal transport. KSWGD directly estimates the spectral structure from trajectory data, enabling potential-free and training-free generative dynamical modeling. Theoretically, we establish rigorous connections between KSWGD and the Feynman–Kac formula as well as solutions to stochastic partial differential equations (SPDEs). Algorithmically, the method evolves the Wasserstein gradient flow via Koopman spectral approximation. Experiments demonstrate that KSWGD achieves significantly faster convergence than state-of-the-art methods on manifold sampling, multiscale/multistable system simulation, image generation, and high-dimensional SPDE solving—while preserving sample quality.
📝 Abstract
We propose Koopman Spectral Wasserstein Gradient Descent (KSWGD), a generative modeling framework that combines operator-theoretic spectral analysis with optimal transport. The key insight is that the spectral structure required for accelerated Wasserstein gradient descent can be estimated directly from trajectory data via Koopman operator approximation, eliminating the need for explicit knowledge of the target potential or for neural network training. We provide a rigorous convergence analysis and establish a connection to Feynman–Kac theory that clarifies the method's probabilistic foundation. Experiments across diverse settings, including compact manifold sampling, metastable multi-well systems, image generation, and high-dimensional stochastic partial differential equations, demonstrate that KSWGD consistently converges faster than existing methods while maintaining high sample quality.
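The abstract's central step, estimating Koopman spectral structure directly from trajectory data, is commonly realized with dynamic mode decomposition (DMD). The sketch below is a minimal illustration of that idea under assumptions of my own: it uses standard exact DMD on snapshot pairs, and the function name `koopman_spectrum_dmd` and the toy linear system are hypothetical; the paper's actual estimator may differ.

```python
import numpy as np

def koopman_spectrum_dmd(X, Y, rank=None):
    """Estimate Koopman eigenvalues and modes from snapshot pairs (X, Y),
    where each column satisfies y_i = F(x_i), via exact DMD.

    This approximates the finite-dimensional Koopman matrix K with Y X^+,
    computed stably through a (possibly truncated) SVD of X.
    """
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:  # optional truncation for noisy/high-dim data
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Reduced Koopman matrix: K_tilde = U* Y V S^{-1}
    K_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(K_tilde)
    # Exact DMD modes lifted back to the full state space
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes

# Toy check: linear dynamics x_{t+1} = A x_t, whose Koopman eigenvalues
# on linear observables are exactly the eigenvalues of A (0.9 and 0.5).
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1],
              [0.0, 0.5]])
X = rng.standard_normal((2, 200))
Y = A @ X
eigvals, _ = koopman_spectrum_dmd(X, Y)
print(sorted(np.real(eigvals)))  # ≈ [0.5, 0.9]
```

In a KSWGD-style pipeline, the recovered eigenvalues (decay rates) and modes would then drive the discretized Wasserstein gradient flow in place of a hand-specified target potential.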