Verifiable Error Bounds for Physics-Informed Neural Network Solutions of Lyapunov and Hamilton-Jacobi-Bellman Equations

📅 2026-03-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the lack of verifiable error guarantees in existing physics-informed neural network (PINN) solutions of Lyapunov and Hamilton-Jacobi-Bellman (HJB) equations: in most prior work it is unclear whether a small PDE residual implies a small solution error. To bridge this gap, the authors develop a theoretical framework that provides rigorous, verifiable error bounds for PINN approximations of these PDEs, which arise throughout nonlinear systems analysis and control. The framework converts residual bounds into relative error bounds and a posteriori estimates with respect to the true solution, proves that one-sided residual bounds already suffice to guarantee that the PINN approximation itself is a valid Lyapunov function, and delivers computable upper and lower bounds on the optimal value function together with quantified optimality gaps for the induced feedback policies in HJB problems. Numerical experiments demonstrate the effectiveness and practical utility of the proposed methodology.
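For orientation, a minimal sketch of the two PDEs the summary refers to, written in their generic textbook forms (the paper's exact formulation, weights, and boundary conditions may differ):

```latex
% Lyapunov PDE for dynamics \dot{x} = f(x), with a chosen positive
% definite decay rate q(x), e.g. q(x) = \|x\|^2:
\nabla V(x)^{\top} f(x) = -q(x), \qquad V(0) = 0.

% HJB equation for dynamics \dot{x} = f(x,u) and running cost
% \ell(x,u), whose solution V^* is the optimal value function:
\min_{u}\,\bigl[\nabla V^*(x)^{\top} f(x,u) + \ell(x,u)\bigr] = 0,
\qquad V^*(0) = 0.
```

A PINN trains a network \(\widehat{V}\) to drive the left-hand-side residual of one of these equations toward zero at sampled collocation points; the paper's contribution is relating a certified bound on that residual to a bound on \(\widehat{V} - V^*\).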

📝 Abstract
Many core problems in nonlinear systems analysis and control can be recast as solving partial differential equations (PDEs) such as Lyapunov and Hamilton-Jacobi-Bellman (HJB) equations. Physics-informed neural networks (PINNs) have emerged as a promising mesh-free approach for approximating their solutions, but in most existing works there is no rigorous guarantee that a small PDE residual implies a small solution error. This paper develops verifiable error bounds for approximate solutions of Lyapunov and HJB equations, with particular emphasis on PINN-based approximations. For both the Lyapunov and HJB PDEs, we show that a verifiable residual bound yields relative error bounds with respect to the true solutions as well as computable a posteriori estimates in terms of the approximate solutions. For the HJB equation, this also yields certified upper and lower bounds on the optimal value function on compact sublevel sets and quantifies the optimality gap of the induced feedback policy. We further show that one-sided residual bounds already imply that the approximation itself defines a valid Lyapunov or control Lyapunov function. We illustrate the results with numerical examples.
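The one-sided residual condition from the abstract can be illustrated numerically. The sketch below is a hypothetical toy example, not the paper's method: a hand-chosen candidate V stands in for a trained PINN, the dynamics f(x) = -x - x³ and decay rate q(x) = x² are assumptions for illustration, and the check verifies the residual sign on a sampled interval.

```python
import numpy as np

# Hypothetical 1-D dynamics and candidate Lyapunov function
# (the candidate plays the role of a trained PINN approximation).
def f(x):
    return -x - x**3

def V(x):
    return 0.5 * x**2

def grad_V(x):
    return x

# Lyapunov PDE residual: r(x) = grad_V(x) * f(x) + q(x) with q(x) = x**2.
# The one-sided condition r(x) <= 0 on the domain certifies that V
# decreases along trajectories, i.e. V is a valid Lyapunov function.
xs = np.linspace(-1.0, 1.0, 2001)
residual = grad_V(xs) * f(xs) + xs**2   # analytically equals -x**4

print("sup residual:", residual.max())  # 0.0, attained only at x = 0
assert residual.max() <= 0.0
```

In practice the residual of a neural candidate has no closed form, so this pointwise sampling would be replaced by a verifiable bound (e.g. via interval arithmetic or Lipschitz estimates), which is the kind of certificate the paper builds on.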
Problem

Research questions and friction points this paper is trying to address.

Physics-Informed Neural Networks
Lyapunov Equations
Hamilton-Jacobi-Bellman Equations
Error Bounds
Verification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Verifiable Error Bounds
Physics-Informed Neural Networks
Lyapunov Equations
Hamilton-Jacobi-Bellman Equations
A Posteriori Error Estimation