Multilevel Picard approximations and deep neural networks with ReLU, leaky ReLU, and softplus activation overcome the curse of dimensionality when approximating semilinear parabolic partial differential equations in Lp-sense

📅 2024-09-30
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
High-dimensional semilinear parabolic PDEs with gradient-independent, Lipschitz-continuous nonlinearities suffer from the "curse of dimensionality" under conventional numerical methods. Method: This paper proposes an Lᵖ-approximation scheme combining multilevel Picard iteration with deep neural networks employing ReLU, leaky ReLU, or softplus activation functions. Contribution/Results: The authors provide the first rigorous Lᵖ-error analysis proving that, for a prescribed accuracy ε > 0, both the computational effort and the number of network parameters grow at most polynomially in the dimension d and in ε⁻¹, thereby overcoming the curse of dimensionality. Unlike classical methods, the approach applies in arbitrary dimension d and delivers stronger theoretical guarantees: an Lᵖ approximation error of at most ε is achievable at polynomially bounded cost, without requiring prior knowledge of the solution's gradients or strong regularity assumptions.
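As a rough illustration of the multilevel Picard (MLP) idea named above, the sketch below implements the standard MLP recursion for a model PDE ∂ₜu + ½Δu + f(u) = 0 with terminal condition u(T, ·) = g, using Monte Carlo samples of Brownian increments. This is not the paper's exact construction (in particular it omits the neural-network approximation step), and the function name `mlp` and its parameters are our own:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(n, M, t, x, T, f, g):
    """One realization of the level-n MLP estimate of u(t, x) for the
    model PDE  u_t + (1/2) * Laplacian(u) + f(u) = 0,  u(T, .) = g.
    Illustrative sketch only; intended for small n and M."""
    if n == 0:
        return 0.0  # base case: U_0 = 0
    d = x.shape[0]
    # Monte Carlo estimate of the terminal part E[g(x + W_{T-t})]
    mn = M ** n
    W = rng.normal(0.0, np.sqrt(T - t), size=(mn, d))
    out = float(np.mean([g(x + W[i]) for i in range(mn)]))
    # telescoping Picard correction terms over levels l = 0 .. n-1,
    # with M**(n-l) samples at level l and a uniform random time R in [t, T]
    for l in range(n):
        ml = M ** (n - l)
        acc = 0.0
        for _ in range(ml):
            R = t + (T - t) * rng.uniform()
            y = x + rng.normal(0.0, np.sqrt(R - t), size=d)
            diff = f(mlp(l, M, R, y, T, f, g))
            if l > 0:  # subtract the previous level (telescoping sum)
                diff -= f(mlp(l - 1, M, R, y, T, f, g))
            acc += diff
        out += (T - t) * acc / ml
    return out
```

For f(u) = u and g ≡ 1, the exact solution at t = 0, T = 1 is e ≈ 2.718, and `mlp(3, 4, 0.0, np.zeros(5), 1.0, lambda u: u, lambda x: 1.0)` recovers roughly the degree-2 Taylor value plus Monte Carlo noise; the number of samples needed for a given accuracy grows only polynomially in d, which is the cost bound the paper makes rigorous.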

📝 Abstract
We prove that multilevel Picard approximations and deep neural networks with ReLU, leaky ReLU, and softplus activation are capable of approximating solutions of semilinear Kolmogorov PDEs in $L^{\mathfrak{p}}$-sense, $\mathfrak{p}\in [2,\infty)$, in the case of gradient-independent, Lipschitz-continuous nonlinearities, while the computational effort of the multilevel Picard approximations and the required number of parameters in the neural networks grow at most polynomially in both dimension $d\in \mathbb{N}$ and reciprocal of the prescribed accuracy $\epsilon$.
Problem

Research questions and friction points this paper is trying to address.

High-dimensional Partial Differential Equations
Nonlinear Parabolic PDEs
Dimensionality Curse
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-level Picard Iteration
Deep Neural Networks
High-dimensional Partial Differential Equations