Deep Neural Networks for Solving High-Dimensional Parabolic Partial Differential Equations

📅 2026-01-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the long-standing challenge of the curse of dimensionality in numerically solving high-dimensional parabolic partial differential equations (PDEs), where traditional grid-based methods fail to scale. The study systematically categorizes existing deep learning approaches into three distinct paradigms: physics-informed neural networks (PINNs) based on PDE residuals, stochastic methods rooted in the Feynman–Kac formula and backward stochastic differential equations (BSDEs), and derivative-free hybrid stochastic finite difference schemes. By establishing a unified framework, the paper clarifies the conceptual relationships and applicability boundaries among these paradigms and provides a comprehensive evaluation of their respective strengths and limitations. The scalability, accuracy, and practicality of the reviewed methods are rigorously demonstrated on benchmark problems—including Hamilton–Jacobi–Bellman and Black–Scholes equations—up to 1,000 dimensions.
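The second paradigm above rests on the Feynman–Kac formula: the solution of a backward parabolic PDE can be written as an expectation over diffusion paths, so it can be evaluated by Monte Carlo sampling at a cost that does not grow with a spatial grid. A minimal sketch of this idea (for the plain heat equation, without any neural network, and with illustrative parameter choices not taken from the paper):

```python
import numpy as np

# Feynman–Kac for the d-dimensional heat equation
#   u_t + (1/2) * Laplacian(u) = 0,  u(T, .) = g,
# whose solution is u(t, x) = E[g(x + W_{T-t})] with W a standard
# Brownian motion. The Monte Carlo estimate below needs no spatial
# grid, so the dimension d can be large.
def feynman_kac(g, x, t, T, n_paths=200_000, rng=None):
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    # Sample terminal points of Brownian paths started at x at time t.
    W = rng.standard_normal((n_paths, d)) * np.sqrt(T - t)
    return g(x + W).mean()

# Example in d = 100 with g(x) = |x|^2, where the exact solution is
# u(t, x) = |x|^2 + d * (T - t); here |x|^2 = 100 and d*(T-t) = 100.
d = 100
x = np.ones(d)
est = feynman_kac(lambda y: (y ** 2).sum(axis=1), x, t=0.0, T=1.0, rng=0)
```

The estimator's error scales like `1/sqrt(n_paths)` independently of `d`, which is exactly the property the stochastic paradigm exploits against the curse of dimensionality; the deep BSDE methods reviewed in the paper replace the known terminal expectation with a learned function.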

📝 Abstract
The numerical solution of high-dimensional partial differential equations (PDEs) is severely constrained by the curse of dimensionality (CoD), rendering classical grid-based methods impractical beyond a few dimensions. In recent years, deep neural networks have emerged as a promising mesh-free alternative, enabling the approximation of PDE solutions in tens to thousands of dimensions. This review provides a tutorial-oriented introduction to neural-network-based methods for solving high-dimensional parabolic PDEs, emphasizing conceptual clarity and methodological connections. We organize the literature around three unifying paradigms: (i) PDE residual-based approaches, including physics-informed neural networks and their high-dimensional variants; (ii) stochastic methods derived from Feynman–Kac and backward stochastic differential equation formulations; and (iii) hybrid derivative-free random difference approaches designed to alleviate the computational cost of derivatives in high dimensions. For each paradigm, we outline the underlying mathematical formulation, algorithmic implementation, and practical strengths and limitations. Representative benchmark problems, including Hamilton–Jacobi–Bellman and Black–Scholes equations in up to 1,000 dimensions, illustrate the scalability, effectiveness, and accuracy of the methods. The paper concludes with a discussion of open challenges and future directions for reliable and scalable solvers of high-dimensional PDEs.
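The "derivative-free random difference" idea of paradigm (iii) can be illustrated without any network: for a smooth function and a Gaussian probe direction Z, a central difference along Z has expectation equal to the Laplacian, since E[Z^T H Z] = tr(H) for any Hessian H. A minimal numpy sketch (the function, step size, and sample count are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Stochastic central difference: for smooth u and Z ~ N(0, I_d),
#   E[(u(x + h Z) - 2 u(x) + u(x - h Z)) / h^2] ≈ tr(Hess u)(x) = Laplacian(u)(x),
# because E[Z^T H Z] = tr(H). Each sample needs only two extra function
# evaluations, instead of d second-derivative computations.
def random_laplacian(u, x, h=1e-3, n_samples=50_000, rng=None):
    rng = np.random.default_rng(rng)
    Z = rng.standard_normal((n_samples, x.shape[0]))
    diff = u(x + h * Z) - 2.0 * u(x) + u(x - h * Z)
    return (diff / h ** 2).mean()

# Check on u(x) = |x|^2 in d = 1000, where Laplacian(u) = 2d = 2000 exactly.
d = 1000
x = np.zeros(d)
lap = random_laplacian(lambda y: (y ** 2).sum(axis=-1), x, rng=0)
```

In the hybrid schemes the review covers, estimators of this flavor replace exact automatic differentiation of the network inside the training loss, trading a small Monte Carlo variance for a cost per sample that is flat in the dimension.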
Problem

Research questions and friction points this paper is trying to address.

high-dimensional PDEs
curse of dimensionality
parabolic partial differential equations
numerical solution
Innovation

Methods, ideas, or system contributions that make the work stand out.

deep neural networks
high-dimensional PDEs
physics-informed neural networks
Feynman–Kac formula
curse of dimensionality
Wenzhong Zhang
Suzhou Institute for Advanced Research, University of Science and Technology of China
Zhenyuan Hu
Department of Computer Science, School of Computing, National University of Singapore, 119077, Singapore
Wei Cai
Clements Chair Professor in Applied Math, Southern Methodist University
Computational Electromagnetics, Stochastic Methods
G. Karniadakis
Division of Applied Mathematics, Brown University, Providence, RI 02912, USA