🤖 AI Summary
This work investigates the regularity of solutions to the parameter-dependent Beckmann optimal transport problem. By deriving first-order optimality conditions from an unconstrained Lagrangian formulation, the flux field is expressed as the gradient of a potential function. Combining Schauder estimates for elliptic PDEs with Hölder space analysis, the study establishes, for the first time, joint and separate Hölder continuity of the flux field with respect to both the source/target distributions and the parameters. The results extend further to other probability paths, such as Fisher-Rao gradient flows, yielding precise Hölder exponents for the potential, the flux, and the induced flows under given density regularity assumptions. Moreover, the paper proves that these solutions can be efficiently approximated by deep ReQU neural networks.
📝 Abstract
Beckmann's problem in optimal transport minimizes the total squared flux of a continuous transport from a source to a target distribution. In this article, a regularity theory for solutions to Beckmann's problem is developed by means of an unconstrained Lagrangian formulation and the resulting variational first-order optimality conditions. It turns out that the Lagrange multiplier enforcing Beckmann's divergence constraint satisfies a Poisson equation, and the flux vector field is obtained as the gradient of this potential. Using Schauder estimates from elliptic regularity theory, the exact Hölder regularity of the potential, the flux, and the flow it generates is derived from the Hölder regularity of the source and target densities on a bounded, regular domain. If the target distribution depends on parameters, as in conditional ("promptable") generative learning, we provide sufficient conditions for separate and joint Hölder continuity of the resulting vector field in the parameter and the data dimension. Following a recent result by Belomestny et al., one can thus approximate such vector fields with deep ReQU neural networks in the C^{k,α}-Hölder norm. We also show that this approach generalizes to other probability paths, such as Fisher-Rao gradient flows.
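The Lagrangian derivation outlined in the abstract can be sketched as follows; the sign conventions, the normalization of the objective, and the Neumann (no-flux) boundary condition are illustrative assumptions, not details taken from the paper:

```latex
% Squared-flux Beckmann problem with divergence constraint
% (source density \mu, target density \nu on a bounded domain \Omega):
\[
  \min_{v}\ \int_\Omega \tfrac12 |v|^2 \,dx
  \quad\text{s.t.}\quad \operatorname{div} v = \mu - \nu \ \text{in } \Omega,
  \qquad v \cdot n = 0 \ \text{on } \partial\Omega .
\]
% Unconstrained Lagrangian with multiplier u for the constraint:
\[
  \mathcal{L}(v,u) = \int_\Omega \tfrac12 |v|^2 \,dx
    + \int_\Omega u \,\bigl(\operatorname{div} v - (\mu - \nu)\bigr)\,dx .
\]
% Integrating by parts (boundary term vanishes by the no-flux condition),
% stationarity in v gives v = \nabla u; stationarity in u recovers the
% constraint. Together these yield the Neumann--Poisson problem
\[
  \Delta u = \mu - \nu \ \text{in } \Omega, \qquad
  \partial_n u = 0 \ \text{on } \partial\Omega ,
\]
% so Schauder estimates with \mu - \nu \in C^{0,\alpha} give
% u \in C^{2,\alpha} and hence v = \nabla u \in C^{1,\alpha}.
```

This matches the Hölder gain stated in the abstract: one order of smoothness for the flux and two for the potential relative to the densities.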