🤖 AI Summary
This work addresses the inverse problem of quantifying uncertainty in the unknown parameters of dynamical systems via their posterior distributions. We propose an optimization framework based on the squared 2-Wasserstein distance with local temporal decoupling. The unknown posterior is parameterized by a stochastic neural network, and the 2-Wasserstein distance minimization is decomposed into independent subproblems at each time step, circumventing the computational bottlenecks and convergence issues that arise from global temporal coupling. A differentiable loss function is constructed by combining variational inference with forward simulation of the dynamical system. Experiments on multiple nonlinear dynamical systems show that the method reconstructs posterior distributions with markedly better accuracy, stability, and training robustness than approaches based on KL divergence or globally coupled Wasserstein optimization.
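The decoupling idea can be made concrete in one dimension: the squared 2-Wasserstein distance between two empirical distributions with equal sample counts reduces to the mean squared difference of their sorted samples, so the loss splits into an independent term per time step. Below is a minimal illustrative sketch, not the authors' implementation: the function name, array shapes, and the use of NumPy are assumptions, and in practice the loss would be written in an autodiff framework so it can train the stochastic neural network end to end.

```python
import numpy as np

def local_time_decoupled_w2_loss(sim, obs):
    """Illustrative sketch (hypothetical helper, not the paper's code).

    Sums, over time steps, the squared 2-Wasserstein distance between
    the empirical distributions of simulated and observed states at
    each step (1-D state, equal sample counts at every step).

    sim, obs: arrays of shape (n_samples, n_steps).
    """
    # In 1-D, W2^2 between empirical measures with equal sample counts
    # is the mean squared difference of the order statistics.
    sim_sorted = np.sort(sim, axis=0)
    obs_sorted = np.sort(obs, axis=0)
    per_step_w2_sq = np.mean((sim_sorted - obs_sorted) ** 2, axis=0)
    # Local temporal decoupling: each time step contributes an
    # independent subproblem; the total loss is their sum.
    return per_step_w2_sq.sum()
```

Because each per-step term depends only on that step's samples, the terms can be evaluated (and minimized) independently, which is the decoupling that avoids global temporal coupling across the trajectory.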
📝 Abstract
In this work, we propose and analyze a new local time-decoupled squared Wasserstein-2 method for reconstructing the distribution of unknown parameters in dynamical systems. Specifically, we show that a stochastic neural network trained by minimizing our proposed local time-decoupled squared Wasserstein-2 loss function accurately approximates the distribution of uncertain model parameters. Through several numerical examples, we demonstrate the effectiveness of the proposed method in reconstructing parameter distributions across different dynamical systems.