Regularity of Solutions to Beckmann's Parametric Optimal Transport

📅 2026-03-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the regularity of solutions to the parameter-dependent Beckmann optimal transport problem. By deriving first-order optimality conditions through an unconstrained Lagrangian formulation, the flux field is expressed as the gradient of a potential function. Combining Schauder estimates for elliptic PDEs with Hölder space analysis, the study establishes—for the first time—joint and separate Hölder continuity of the flux field with respect to both source/target distributions and parameters. The results are further extended to probability paths such as Fisher–Rao gradient flows, yielding precise Hölder exponents for the potential, flux, and their induced flows under given density regularity assumptions. Moreover, the paper proves that these solutions can be efficiently approximated by deep ReQu neural networks.
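The derivation the summary describes can be sketched compactly. With $w$ the flux field, $\varphi$ the Lagrange multiplier (potential), and $\mu,\nu$ the source and target densities (signs depend on convention; this is our reading of the setup, not notation taken from the paper):

```latex
\min_{w}\ \int_\Omega |w(x)|^2\,dx
\quad\text{subject to}\quad \nabla\!\cdot w = \mu - \nu .
```

Introducing the Lagrangian $L(w,\varphi) = \int_\Omega \tfrac12|w|^2\,dx + \int_\Omega \varphi\,\bigl(\nabla\!\cdot w - (\mu-\nu)\bigr)\,dx$ and integrating by parts (assuming no-flux boundary conditions), stationarity in $w$ gives $w = \nabla\varphi$, and the divergence constraint then becomes the Poisson equation $\Delta\varphi = \mu - \nu$, which is where the Schauder estimates enter.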

📝 Abstract
Beckmann's problem in optimal transport minimizes the total squared flux of a continuous transport problem from a source to a target distribution. In this article, the regularity theory for solutions to Beckmann's problem is developed by means of an unconstrained Lagrangian formulation and the variational first-order optimality conditions. It turns out that the Lagrange multiplier enforcing Beckmann's divergence constraint fulfills a Poisson equation, and the flux vector field is obtained as the gradient of this potential. Utilizing Schauder estimates from elliptic regularity theory, the exact Hölder regularity of the potential, the flux, and the generated flow is derived from the Hölder regularity of the source and target densities on a bounded, regular domain. If the target distribution depends on parameters, as is the case in conditional (``promptable'') generative learning, we provide sufficient conditions for separate and joint Hölder continuity of the resulting vector field in the parameter and the data dimension. Following a recent result by Belomestny et al., such vector fields can thus be approximated by deep ReQu neural networks in C^(k,α)-Hölder norm. We also show that this approach generalizes to other probability paths, such as Fisher–Rao gradient flows.
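The solution structure stated in the abstract (a potential solving a Poisson equation, the flux as its gradient) can be illustrated numerically. The following minimal 1-D periodic sketch is not code from the paper: the FFT-based solver, grid size, and toy densities are our own choices, and signs follow the convention ∇·w = μ − ν.

```python
import numpy as np

# Toy 1-D periodic setup on [0, 1): source density mu, target density nu,
# both with the same total mass so the Poisson problem is solvable.
n = 256
x = np.linspace(0.0, 1.0, n, endpoint=False)
mu = 1.0 + 0.3 * np.cos(2 * np.pi * x)   # source density
nu = 1.0 + 0.3 * np.cos(4 * np.pi * x)   # target density

# Right-hand side of the Poisson equation phi'' = mu - nu
# (the 1-D form of the divergence constraint with w = phi').
rho = mu - nu

# Solve phi'' = rho spectrally: (i k)^2 phi_hat = rho_hat.
k = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / n)   # angular wave numbers
rho_hat = np.fft.fft(rho)
phi_hat = np.zeros_like(rho_hat)
nz = k != 0                                    # zero mode fixed by gauge phi_hat[0] = 0
phi_hat[nz] = -rho_hat[nz] / k[nz] ** 2

phi = np.real(np.fft.ifft(phi_hat))            # potential (Lagrange multiplier)
w = np.real(np.fft.ifft(1j * k * phi_hat))     # flux = gradient of the potential
```

Evaluating the (spectral) derivative of `w` recovers `rho`, i.e. the flux satisfies the divergence constraint by construction; the smoothing from ρ to φ to w is the 1-D shadow of the elliptic gain of regularity the paper quantifies in Hölder norms.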
Problem

Research questions and friction points this paper is trying to address.

optimal transport
Beckmann's problem
regularity theory
Hölder continuity
parametric distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Beckmann's problem
Hölder regularity
elliptic regularity theory
parametric optimal transport
deep ReQu networks