Exploring Efficient Quantification of Modeling Uncertainties with Differentiable Physics-Informed Machine Learning Architectures

📅 2025-06-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of quantifying and propagating uncertainty in physics-informed machine learning (PIML), which limits reliability analysis and robust optimization. We propose a differentiable uncertainty-aware architecture that embeds Bayesian neural networks into a differentiable hybrid physics model, enabling end-to-end uncertainty propagation. A two-stage training strategy is introduced to enhance convergence stability. The method integrates physical constraints, automatic differentiation, Bayesian inference, and Monte Carlo sampling to jointly estimate predictive means and uncertainties in a fully differentiable manner. Evaluations on benchmark functions and real-world fixed-wing UAV flight data demonstrate that our approach achieves prediction accuracy comparable to state-of-the-art PIML models, attains >92% uncertainty coverage, and significantly improves propagation fidelity via Monte Carlo sampling. This provides a reliable, differentiable uncertainty quantification framework for model-driven engineering design and control.
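The summary above describes drawing Monte Carlo samples of BNN weights and pushing each sample through the differentiable physics model to obtain predictive means and uncertainties. A minimal sketch of that propagation step, where the toy physics term, the one-unit mean-field `TinyBNN`, and all parameter values are illustrative assumptions rather than the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical partial-physics model: the BNN supplies an adaptive
# parameter (e.g. an aerodynamic coefficient) that the physics consumes.
def physics_model(x, theta):
    # Toy stand-in for the paper's partial physics: quadratic term.
    return theta * x**2

class TinyBNN:
    """Minimal mean-field Gaussian BNN (illustrative, not the authors'
    network): each weight has a learned mean and standard deviation."""
    def __init__(self, mu, sigma):
        self.mu, self.sigma = np.asarray(mu), np.asarray(sigma)

    def sample_theta(self, x):
        # Draw one weight sample and map the input through a single unit.
        w = rng.normal(self.mu, self.sigma)
        return np.tanh(w[0] * x + w[1])  # adaptive parameter in (-1, 1)

def mc_propagate(bnn, x, n_samples=500):
    """Monte Carlo propagation: sample BNN weights, push each sample
    through the physics model, then collect predictive statistics."""
    preds = np.array([physics_model(x, bnn.sample_theta(x))
                      for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)

bnn = TinyBNN(mu=[0.8, 0.1], sigma=[0.05, 0.02])
mean, std = mc_propagate(bnn, x=np.array([1.0, 2.0]))
```

Because every operation is differentiable, the same sampling path could in principle be back-propagated through, which is the property the architecture exploits.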

📝 Abstract
Quantifying and propagating modeling uncertainties is crucial for reliability analysis, robust optimization, and other model-based algorithmic processes in engineering design and control. Physics-informed machine learning (PIML) methods have emerged in recent years as an alternative to traditional computational and surrogate modeling methods, offering a balance between computational efficiency, modeling accuracy, and interpretability. However, their ability to predict and propagate modeling uncertainties remains mostly unexplored. In this paper, a promising class of auto-differentiable hybrid PIML architectures that combine partial physics with artificial neural networks (ANNs, used for input transformation or adaptive parameter estimation) is integrated with Bayesian neural networks (BNNs) replacing the ANNs, with the goal of exploring whether BNNs can provide uncertainty propagation capabilities within these PIML architectures, further supported by their auto-differentiability. A two-stage training process is used to alleviate the challenges traditionally encountered in training probabilistic ML models. The resulting BNN-integrated PIML architecture is evaluated on an analytical benchmark problem and on flight experiment data from a fixed-wing RC aircraft, with prediction performance observed to be at par with, or slightly worse than, purely data-driven ML models and the original PIML models. Moreover, Monte Carlo sampling of the probabilistic BNN weights was found to be the most effective approach for propagating uncertainty in the BNN-integrated PIML architectures.
Problem

Research questions and friction points this paper is trying to address.

Quantify modeling uncertainties in physics-informed machine learning
Explore Bayesian Neural Networks for uncertainty propagation
Evaluate hybrid PIML architectures on engineering applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Differentiable hybrid PIML with Bayesian Neural Networks
Two-stage training for probabilistic ML models
Monte Carlo sampling for uncertainty propagation
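The two-stage training idea listed above can be sketched on a toy problem. Everything here (the linear data, `w_det`, `nll`, the grid search) is an illustrative stand-in for the paper's pipeline, not its actual method: stage 1 fits deterministic point estimates, and stage 2 holds those means fixed while learning only the uncertainty parameters, which is what stabilizes probabilistic training.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data standing in for the paper's benchmarks:
# y = 2x plus Gaussian measurement noise.
x = rng.uniform(-1.0, 1.0, 200)
y = 2.0 * x + rng.normal(0.0, 0.1, 200)

# Stage 1: fit a deterministic point estimate of the model weight
# (closed-form least squares here, standing in for ordinary ANN training).
w_det = float(np.sum(x * y) / np.sum(x * x))

# Stage 2: freeze the weight mean at the stage-1 solution and fit only
# the weight's standard deviation by minimizing the Gaussian predictive
# negative log-likelihood (a grid search stands in for variational updates).
def nll(sig):
    s2 = (sig * x) ** 2 + 0.1 ** 2        # predictive variance per input
    r2 = (y - w_det * x) ** 2             # squared residuals of the mean
    return np.mean(0.5 * np.log(2 * np.pi * s2) + r2 / (2 * s2))

sigmas = np.linspace(1e-3, 1.0, 200)
sigma_w = sigmas[int(np.argmin([nll(s) for s in sigmas]))]
```

Separating the mean fit from the variance fit avoids the unstable joint optimization that often plagues BNN training from scratch.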