🤖 AI Summary
Data-driven decision-making often fails in deployment due to distributional shifts arising from contextual dependencies, partial observability, or stress perturbations. This work proposes a unified framework for flow- and score-based generative models that leverages pushforward maps, velocity fields, and score fields, integrated with the Fokker–Planck equation and Wasserstein geometry to construct, transform, and optimize uncertainty distributions in probability space. In doing so, the approach elevates generative models from mere sample-synthesis tools to principled mathematical mechanisms for distributional manipulation, enabling robustness analysis, conditional inference, and stress-scenario generation. The method is supported by theoretical guarantees, including forward–reverse convergence, first-order minimax optimality in the space of transport maps, and error bounds for posterior sampling with generative priors.
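To make the score-field idea concrete, here is a minimal sketch (not from the tutorial; all parameters are hypothetical) of unadjusted Langevin dynamics driven by an analytic score field, using a Gaussian target so the score is known in closed form. Score-based generative samplers follow the same stochastic dynamics with a *learned* score in place of the analytic one.

```python
import numpy as np

# Illustrative sketch: particles evolved by unadjusted Langevin dynamics
#   x_{k+1} = x_k + eps * score(x_k) + sqrt(2 * eps) * noise
# with the analytic score of a Gaussian target N(mu, sigma^2).
rng = np.random.default_rng(1)

mu, sigma = 1.0, 0.5                    # hypothetical target parameters
score = lambda x: -(x - mu) / sigma**2  # grad_x log N(x; mu, sigma^2)

eps, steps = 5e-3, 2000                 # step size and number of iterations
x = rng.standard_normal(10_000)         # particles initialized from N(0, 1)
for _ in range(steps):
    x = x + eps * score(x) + np.sqrt(2 * eps) * rng.standard_normal(x.shape)

# After many steps the particle cloud approximates N(mu, sigma^2),
# up to an O(eps) discretization bias.
```

The same loop, with the analytic score replaced by a neural network trained via score matching, is the computational core of score-based generative models.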
📝 Abstract
Many data-driven decision problems are formulated using a nominal distribution estimated from historical data, while performance is ultimately determined by a deployment distribution that may be shifted, context-dependent, partially observed, or stress-induced. This tutorial presents modern generative models, particularly flow- and score-based methods, as mathematical tools for constructing decision-relevant distributions. From an operations research perspective, their primary value lies not in unconstrained sample synthesis but in representing and transforming distributions through transport maps, velocity fields, score fields, and guided stochastic dynamics. We present a unified framework based on pushforward maps, the continuity and Fokker–Planck equations, Wasserstein geometry, and optimization in probability space. Within this framework, generative models can be used to learn nominal uncertainty, construct stressed or least-favorable distributions for robustness, and produce conditional or posterior distributions under side information and partial observation. We also highlight representative theoretical guarantees, including forward–reverse convergence for iterative flow models, first-order minimax analysis in transport-map space, and error-transfer bounds for posterior sampling with generative priors. The tutorial provides a principled introduction to using generative models for scenario generation, robust decision-making, uncertainty quantification, and related problems under distributional shift.
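The pushforward viewpoint in the abstract can be illustrated with a minimal sketch (an assumption-laden toy example, not the tutorial's construction): a transport map T carries a base distribution rho_0 to the pushforward T#rho_0, so sampling from the target reduces to sampling the base and applying T. Here T is affine, so the pushforward of a standard Gaussian is Gaussian with the prescribed mean and scale.

```python
import numpy as np

# Illustrative sketch: an affine transport map T pushes the base
# distribution rho_0 = N(0, 1) forward to T#rho_0 = N(mu, sigma^2).
rng = np.random.default_rng(0)

mu, sigma = 2.0, 0.5           # hypothetical target parameters
T = lambda x: mu + sigma * x   # transport map T(x) = mu + sigma * x

base = rng.standard_normal(100_000)  # samples from rho_0
pushed = T(base)                     # samples from the pushforward T#rho_0

# Empirical mean and std of the pushed samples match (mu, sigma).
print(pushed.mean(), pushed.std())
```

Flow-based generative models replace the affine T with a learned invertible map (or the flow map of a learned velocity field), but the sampling recipe, draw from the base and push forward, is the same.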