Learning from Samples: Inverse Problems over measures via Sharpened Fenchel-Young Losses

📅 2025-05-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the stable inversion of implicit parameters—such as transport costs or potential functions—governing static optimization (e.g., optimal transport) or dynamic evolution (e.g., gradient flows) from limited samples. We formulate two inverse problems: inverse unbalanced optimal transport (iUOT) and inverse JKO gradient flow (iJKO). To solve them, we propose the sharpened Fenchel–Young loss, the first differentiable, strongly convex loss defined directly on the space of measures and quantified by the sub-optimality gap of the forward problem. Theoretically, we establish explicit finite-sample stability guarantees for parameter recovery. Empirically, we validate the consistency and practicality of parameter estimation on Gaussian models. Our key innovations are: (i) embedding the optimization gap into the loss design to enable gradient-based inference, and (ii) developing a unified stability-analysis framework in the measure space that applies to both static and dynamic inverse problems.

📝 Abstract
Estimating parameters from samples of an optimal probability distribution is essential in applications ranging from socio-economic modeling to biological system analysis. In these settings, the probability distribution arises as the solution to an optimization problem that captures either static interactions among agents or the dynamic evolution of a system over time. Our approach relies on minimizing a new class of loss functions, called sharpened Fenchel-Young losses, which measure the sub-optimality gap of the optimization problem over the space of measures. We study the stability of this estimation method when only a finite number of samples is available. The parameters to be estimated typically correspond to a cost function in static problems and to a potential function in dynamic problems. To analyze stability, we introduce a general methodology that leverages the strong convexity of the loss function together with the sample complexity of the forward optimization problem. Our analysis emphasizes two specific settings in the context of optimal transport, where our method provides explicit stability guarantees: the first is inverse unbalanced optimal transport (iUOT) with entropic regularization, where the parameters to estimate are cost functions that govern transport computations; this method has applications such as link prediction in machine learning. The second is inverse gradient flow (iJKO), where the objective is to recover a potential function that drives the evolution of a probability distribution via the Jordan-Kinderlehrer-Otto (JKO) time-discretization scheme; this is particularly relevant for understanding cell population dynamics in single-cell genomics. Finally, we validate our approach through numerical experiments on Gaussian distributions, where closed-form solutions are available, to demonstrate the practical performance of our methods.
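To make the sub-optimality-gap idea concrete, here is a minimal sketch of a classic finite-dimensional Fenchel-Young loss with a quadratic regularizer. This is an illustrative toy (the function name and setup are hypothetical, and the paper's contribution is the lift of this construction to the space of measures), but it shows the key property: the loss equals the gap between a candidate solution and the optimum of the regularized forward problem, and it is differentiable and strongly convex.

```python
import numpy as np

# Fenchel-Young loss with Omega(y) = 0.5 * ||y||^2, for which the convex
# conjugate is Omega*(theta) = 0.5 * ||theta||^2. Then
#   L(theta, y) = Omega*(theta) + Omega(y) - <theta, y>
# equals f(y) - min_y' f(y') for f(y') = Omega(y') - <theta, y'>,
# i.e. the sub-optimality gap of y, and simplifies to 0.5 * ||theta - y||^2.
def fenchel_young_quadratic(theta, y):
    theta = np.asarray(theta, dtype=float)
    y = np.asarray(y, dtype=float)
    return 0.5 * theta @ theta + 0.5 * y @ y - theta @ y

theta = np.array([1.0, 2.0])
y = np.array([1.0, 0.0])
gap = fenchel_young_quadratic(theta, y)  # equals 0.5 * ||theta - y||^2 = 2.0
```

The gap vanishes exactly when `y` solves the forward problem (`y == theta` here), which is what makes minimizing such a loss over parameters a well-posed, gradient-friendly inversion strategy.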
Problem

Research questions and friction points this paper is trying to address.

Estimating parameters from samples of optimal probability distributions
Designing a loss that measures the sub-optimality gap over the space of measures
Establishing stability for inverse unbalanced optimal transport and inverse gradient flows
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces sharpened Fenchel-Young losses, strongly convex on the space of measures
Estimates cost and potential functions directly from samples
Applies to inverse unbalanced optimal transport (iUOT) and inverse JKO gradient flow (iJKO)
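As a toy illustration of the forward problem that iUOT inverts, the sketch below solves entropic optimal transport between two small discrete measures with standard Sinkhorn iterations. This is a generic balanced-OT example, not the paper's unbalanced solver; the grid, cost matrix, and regularization strength are illustrative choices.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, iters=500):
    """Entropic OT coupling between histograms a and b for cost matrix C."""
    K = np.exp(-C / eps)           # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)          # alternating scaling updates
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Two uniform histograms on a 5-point grid with squared-distance cost.
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
a = b = np.full(5, 0.2)
P = sinkhorn(a, b, C)
# The marginals of the coupling P match a and b up to numerical tolerance.
```

In the inverse problem, the cost `C` is unknown and is recovered by minimizing the sharpened Fenchel-Young loss over candidate costs, using samples from the optimal coupling.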