Uncertainty quantification of neural network models of evolving processes via Langevin sampling

📅 2025-04-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses uncertainty quantification for history-dependent dynamical processes, such as chemical reactions and material evolution, by proposing a joint Bayesian data model that combines a neural ordinary differential equation (NODE) with a trainable observation model. Methodologically, it introduces the first framework to embed Langevin dynamics sampling within a hypernetwork architecture, enabling end-to-end joint learning of the data model parameters and the posterior score function, and allowing the computational budget to be balanced between data-model evaluation and posterior approximation. By relying on score-based approximate Bayesian inference, the approach avoids restrictive mean-field assumptions. Experiments on chemical reaction and material physics datasets demonstrate substantial improvements over mean-field variational inference, yielding reliable quantification of parameter uncertainties and efficient posterior sampling.
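To make the sampling idea concrete: the approach draws parameter samples whose drift is a learned approximation of the posterior score. Below is a minimal sketch of one unadjusted Langevin step in plain NumPy; `score_fn` is a hypothetical stand-in for the learned score (here the score of a unit Gaussian), not the paper's actual hypernetwork.

```python
# Minimal sketch (not the authors' code): one unadjusted Langevin step for
# drawing approximate posterior samples of the data-model parameters theta.
# `score_fn` is a hypothetical stand-in for a learned approximation of
# grad_theta log p(theta | data), e.g. produced by the hypernetwork.
import numpy as np

def langevin_step(theta, score_fn, step_size, rng):
    """Advance one parameter sample: drift along the score plus Gaussian noise."""
    noise = rng.standard_normal(theta.shape)
    return theta + step_size * score_fn(theta) + np.sqrt(2.0 * step_size) * noise

# Usage: iterate many steps so the samples approximate the parameter posterior.
rng = np.random.default_rng(0)
theta = rng.standard_normal(128)        # flattened NODE weights and biases
score_fn = lambda t: -t                 # stand-in: score of a unit Gaussian posterior
for _ in range(1000):
    theta = langevin_step(theta, score_fn, 1e-2, rng)
```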

📝 Abstract
We propose a scalable, approximate inference hypernetwork framework for a general model of history-dependent processes. The flexible data model is based on a neural ordinary differential equation (NODE) representing the evolution of internal states together with a trainable observation model subcomponent. The posterior distribution corresponding to the data model parameters (weights and biases) follows a stochastic differential equation with a drift term related to the score of the posterior that is learned jointly with the data model parameters. This Langevin sampling approach offers flexibility in balancing the computational budget between the evaluation cost of the data model and the approximation of the posterior density of its parameters. We demonstrate performance of the hypernetwork on chemical reaction and material physics data and compare it to mean-field variational inference.
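For orientation, the stochastic differential equation the abstract refers to can be written in its standard overdamped Langevin form as below, where \(\mathcal{D}\) denotes the data and \(W_t\) a Wiener process; the drift is the score of the parameter posterior, and that posterior is the stationary distribution. The paper's specific parameterization and discretization may differ from this textbook form.

```latex
% Overdamped Langevin dynamics for the data-model parameters \theta.
% The drift is the posterior score; the stationary law is p(\theta \mid \mathcal{D}).
\mathrm{d}\theta_t \,=\, \nabla_{\theta} \log p(\theta_t \mid \mathcal{D})\,\mathrm{d}t \,+\, \sqrt{2}\,\mathrm{d}W_t
```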
Problem

Research questions and friction points this paper is trying to address.

Quantify uncertainty in neural network models of evolving processes
Develop scalable inference for history-dependent processes via hypernetworks
Balance computational cost between model evaluation and posterior approximation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Scalable hypernetwork framework for history-dependent processes
Neural ODE with trainable observation model (see the sketch after this list)
Langevin sampling for posterior distribution approximation
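As noted in the "Neural ODE with trainable observation model" item above, the sketch below shows one way such a data model can be organized: a neural network right-hand side evolves hidden internal states, and a separate trainable observation model maps states to measurements. It is a minimal PyTorch illustration under assumed layer sizes and names, with a fixed-step Euler integrator standing in for a proper ODE solver; it is not the paper's architecture.

```python
# Minimal PyTorch sketch (illustrative sizes and names, not the paper's model)
# of a data model with two trainable parts: a neural ODE for hidden internal
# states and an observation model mapping states to measurable outputs.
import torch
import torch.nn as nn

class NODEDataModel(nn.Module):
    def __init__(self, state_dim=4, input_dim=1, obs_dim=1, hidden=32):
        super().__init__()
        # dz/dt = f(z, u): evolution of internal (hidden) states
        self.rhs = nn.Sequential(
            nn.Linear(state_dim + input_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim),
        )
        # y = g(z): trainable observation model subcomponent
        self.obs = nn.Linear(state_dim, obs_dim)

    def forward(self, z0, u_seq, dt=0.01):
        z, outputs = z0, []
        for u in u_seq:                      # history-dependent rollout
            z = z + dt * self.rhs(torch.cat([z, u], dim=-1))
            outputs.append(self.obs(z))
        return torch.stack(outputs)

# Usage: roll out 100 steps from a zero initial state under a constant input.
model = NODEDataModel()
y = model(torch.zeros(1, 4), torch.ones(100, 1, 1))
print(y.shape)  # torch.Size([100, 1, 1])
```

In the Bayesian treatment described above, the weights and biases of both subcomponents are the parameters whose posterior is sampled with Langevin dynamics.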
Cosmin Safta
Sandia National Laboratories
combustion, uncertainty quantification, machine learning, Bayesian inference
Reese E. Jones
Sandia National Laboratories
physics, chemistry, mechanical engineering, computational science
Ravi G. Patel
Sandia National Laboratories, Albuquerque, NM 87185
Raelynn Wonnacott
University of Maryland, College Park, MD 20742
Dan S. Bolintineanu
Sandia National Laboratories, Albuquerque, NM 87185
Craig M. Hamel
Sandia National Laboratories, Albuquerque, NM 87185
Sharlotte L.B. Kramer
Sandia National Laboratories, Albuquerque, NM 87185