🤖 AI Summary
Mapping between sequences of continuous probability distributions arises in generative modeling, yet existing optimal transport (OT) methods suffer from high computational cost due to explicit joint distribution estimation or Wasserstein distance computation.
Method: We establish a rigorous equivalence between Action Matching (AM) and OT by formulating distributional transport as learning an optimal vector field for a generative ordinary differential equation (ODE). Under compatible boundary conditions and energy functional design, the AM solution inherently satisfies the Kantorovich duality optimality criterion.
Contribution/Results: This equivalence reveals that OT's optimal coupling is fully characterized by a specific class of integrable vector fields, bypassing explicit joint distribution optimization or Wasserstein metric evaluation. Empirically, AM preserves theoretical optimality while achieving substantial computational efficiency gains. Our work provides a novel theoretical foundation and optimization paradigm for flow-matching-based generative modeling.
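For context on why an optimal coupling can be "characterized by a vector field" at all: quadratic-cost OT admits the standard Benamou–Brenier dynamic formulation (not stated in the summary above, but the usual background for this claim), which casts the Wasserstein-2 distance as a minimization over density paths and velocity fields:

```latex
W_2^2(\rho_0, \rho_1)
= \min_{(\rho_t, v_t)} \int_0^1 \!\! \int \|v_t(x)\|^2 \, \rho_t(x)\, \mathrm{d}x \, \mathrm{d}t
\quad \text{s.t.} \quad
\partial_t \rho_t + \nabla \cdot (\rho_t v_t) = 0,
\qquad \rho_{t=0} = \rho_0, \;\; \rho_{t=1} = \rho_1.
```

The minimizing velocity field is a gradient field, $v_t = \nabla \varphi_t$, which is the class of "optimal vector fields" that the summary refers to.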
📝 Abstract
The Flow Matching (FM) method in generative modeling maps arbitrary probability distributions onto one another by constructing an interpolation between them and then learning the vector field that defines an ODE for this interpolation. Recently, it was shown that FM can be modified to map distributions optimally with respect to the quadratic cost function for any initial interpolation. To achieve this, only specific optimal vector fields, typical of solutions of Optimal Transport (OT) problems, need to be considered during FM loss minimization. In this note, we show that restricting attention to optimal vector fields can lead to OT in another approach: Action Matching (AM). Unlike FM, which learns a vector field for a manually chosen interpolation between two given distributions, AM learns the vector field that defines the ODE for an entire given sequence of distributions.
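To make the FM objective described above concrete, here is a minimal sketch of the (conditional) FM loss under the common straight-line interpolation, whose target velocity is simply `x1 - x0`. The function names, the toy Gaussian data, and the constant vector field are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def fm_loss(v_field, x0, x1, t):
    """Conditional Flow Matching loss for the straight-line interpolation
    x_t = (1 - t) * x0 + t * x1, whose per-pair target velocity is x1 - x0."""
    xt = (1.0 - t)[:, None] * x0 + t[:, None] * x1
    target = x1 - x0
    pred = v_field(xt, t)
    return np.mean(np.sum((pred - target) ** 2, axis=1))

# Toy data: x0 ~ N(0, I), x1 ~ N(mu, I), paired independently.
mu = np.array([3.0, -1.0])
x0 = rng.normal(size=(256, 2))
x1 = rng.normal(size=(256, 2)) + mu
t = rng.uniform(size=256)

# Hypothetical candidate field: the constant translation v(x, t) = mu,
# which is the OT (quadratic-cost) velocity for translating one Gaussian
# onto another with the same covariance.
v_const = lambda xt, t: np.broadcast_to(mu, xt.shape)
loss = fm_loss(v_const, x0, x1, t)
```

The residual loss here does not vanish because the independent pairing of `x0` and `x1` leaves noise in the per-pair targets; FM minimizes this loss in expectation over pairs, which is exactly the setting where restricting the hypothesis class to optimal (gradient) vector fields matters.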