Goal-oriented learning of stochastic dynamical systems using error bounds on path-space observables

📅 2026-03-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge that conventional surrogate models struggle to provide rigorous error guarantees for path-dependent observables—such as first-passage times—when learning stochastic dynamical systems, often leading to inaccurate predictions of critical statistics. To overcome this limitation, the authors propose a goal-oriented learning framework that, for the first time, establishes general error bounds for observables defined on path space and leverages these bounds to construct a variational loss function. The approach accommodates complex functionals over unbounded time horizons, including first-passage times, and derives analytical gradients via Fréchet derivatives to enable efficient optimization with stochastic gradient descent. Numerical experiments demonstrate that the resulting surrogate models achieve significantly improved accuracy in predicting first-passage time statistics and exhibit enhanced robustness under distributional shifts in the data.
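For intuition about the kind of path-dependent observable the paper targets, a mean first-passage (first hitting) time can be estimated by Monte Carlo simulation of the SDE. The sketch below is an illustration only, not the paper's setup: it assumes a one-dimensional overdamped Langevin system with the double-well potential V(x) = (x² − 1)², unit noise, and an Euler–Maruyama discretization, and estimates the mean time for a path started in the left well to first cross into the right well.

```python
import numpy as np

def mean_first_hitting_time(drift, x0, threshold, n_paths=300, dt=1e-3,
                            sigma=1.0, max_steps=50_000, seed=0):
    """Monte Carlo estimate of E[inf{t : X_t >= threshold}] for the SDE
    dX = drift(X) dt + sigma dW, discretized with Euler-Maruyama.
    Paths that never hit within max_steps are excluded from the mean."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, float(x0))
    t_hit = np.full(n_paths, np.nan)            # NaN = not yet hit
    for k in range(max_steps):
        alive = np.isnan(t_hit)
        if not alive.any():
            break
        t_hit[alive & (x >= threshold)] = k * dt
        alive = np.isnan(t_hit)
        x[alive] += (drift(x[alive]) * dt
                     + sigma * np.sqrt(dt) * rng.standard_normal(alive.sum()))
    return np.nanmean(t_hit)

# Overdamped Langevin drift -V'(x) for the double well V(x) = (x^2 - 1)^2.
drift = lambda x: -4.0 * x * (x**2 - 1.0)

mfht = mean_first_hitting_time(drift, x0=-1.0, threshold=0.9)
print(f"estimated mean first hitting time: {mfht:.2f}")
```

Because the hitting time is a functional of the whole path over an unbounded horizon, small pointwise errors in the learned drift can compound into large errors in this statistic, which is the failure mode the goal-oriented loss is designed to control.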

📝 Abstract
The governing equations of stochastic dynamical systems often become cost-prohibitive for numerical simulation at large scales. Surrogate models of the governing equations, learned from data of the high-fidelity system, are routinely used to predict key observables with greater efficiency. However, standard choices of loss function for learning the surrogate model fail to provide error guarantees for path-dependent observables, such as reaction rates of molecular dynamical systems. This paper introduces an error bound for path-space observables and employs it as a novel variational loss for the goal-oriented learning of a stochastic dynamical system. We show the error bound holds for a broad class of observables, including mean first hitting times on unbounded time domains. We derive an analytical gradient of the goal-oriented loss function by leveraging the formula for Fréchet derivatives of expected path functionals, which remains tractable for implementation in stochastic gradient descent schemes. We demonstrate that surrogate models of overdamped Langevin systems developed via goal-oriented learning achieve improved accuracy in predicting the statistics of a first hitting time observable and robustness to distributional shift in the data.
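The goal-oriented idea in the abstract can be caricatured in a few lines: instead of fitting the surrogate's drift to trajectories, tune it so that a path-space observable (here, a mean first hitting time) matches the high-fidelity value. The sketch below replaces the paper's variational bound and analytical Fréchet-derivative gradients with a crude grid search over a single drift-scaling parameter θ, using common random numbers so the comparison is deterministic; every modeling choice (double-well potential, threshold, the θ grid) is an assumption for illustration, not the paper's actual method.

```python
import numpy as np

def mfht(theta, seed=0, n_paths=200, dt=1e-3, sigma=1.0, max_steps=20_000):
    """Mean first hitting time of x >= 0.9 from x0 = -1 under the surrogate
    dX = -theta * V'(X) dt + sigma dW, with V(x) = (x^2 - 1)^2."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, -1.0)
    t_hit = np.full(n_paths, np.nan)            # NaN = not yet hit
    for k in range(max_steps):
        alive = np.isnan(t_hit)
        if not alive.any():
            break
        t_hit[alive & (x >= 0.9)] = k * dt
        alive = np.isnan(t_hit)
        x[alive] += (-theta * 4.0 * x[alive] * (x[alive]**2 - 1.0) * dt
                     + sigma * np.sqrt(dt) * rng.standard_normal(alive.sum()))
    return np.nanmean(t_hit)

target = mfht(theta=1.0)   # observable of the "high-fidelity" system
# Goal-oriented loss: squared mismatch in the path-space observable.
# The same RNG seed (common random numbers) makes the search deterministic.
grid = [0.6, 0.8, 1.0, 1.2]
losses = {th: (mfht(th) - target) ** 2 for th in grid}
best_theta = min(losses, key=losses.get)
print(best_theta)   # -> 1.0: the loss vanishes at the true parameter
```

In the paper this loss is derived from an error bound and minimized by stochastic gradient descent with analytical gradients; the grid search here merely shows what "matching the observable rather than the trajectories" means.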
Problem

Research questions and friction points this paper is trying to address.

stochastic dynamical systems
path-space observables
surrogate models
error bounds
goal-oriented learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

goal-oriented learning
path-space observables
error bounds
stochastic dynamical systems
Fréchet derivative