Solving partial differential equations with sampled neural networks

📅 2024-05-31
🏛️ arXiv.org
📈 Citations: 5
Influential: 0
🤖 AI Summary
To address the gradient-optimization difficulties and the non-causal treatment of time inherent in physics-informed neural networks (PINNs) for time-dependent partial differential equations (PDEs), this work proposes a gradient-free, causally structured stochastic neural basis function method. Spatially, it constructs neural basis functions with random weights in the hidden layer; temporally, it integrates the time evolution explicitly with classical ODE solvers. The work introduces a dual-mode weight sampling strategy (data-agnostic and data-aware) and establishes $L^2$ convergence in Barron spaces theoretically. The method combines mesh-free flexibility with spectral accuracy, and it enables long-time simulation and the solution of inverse problems. Numerical experiments on diverse elliptic and time-dependent PDEs show improvements of one to two orders of magnitude in training speed and accuracy over PINNs, along with strong generalization capability and numerical stability.
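The spatial part of the method can be illustrated with a minimal random-feature sketch: hidden weights and biases are sampled once (the data-agnostic mode) and only the linear output coefficients are fit by a single least-squares solve, here for a 1D Poisson problem. The network width, sampling ranges, and collocation setup below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sampled network u(x) ≈ sum_k c_k * tanh(w_k x + b_k): hidden weights and
# biases are drawn once at random; only the linear output coefficients c
# are fit, so no gradient descent is needed.
K = 200                          # number of neural basis functions (assumed)
w = rng.uniform(-10.0, 10.0, K)  # illustrative sampling ranges
b = rng.uniform(-10.0, 10.0, K)

def phi(x):
    """Basis functions evaluated at points x, shape (len(x), K)."""
    return np.tanh(np.outer(x, w) + b)

def phi_xx(x):
    """Second spatial derivative of each basis function."""
    t = np.tanh(np.outer(x, w) + b)
    return (w ** 2) * (-2.0 * t * (1.0 - t ** 2))

# Test problem: -u'' = pi^2 sin(pi x) on (0, 1), u(0) = u(1) = 0,
# with exact solution u(x) = sin(pi x).
xc = np.linspace(0.0, 1.0, 100)               # collocation points
A = np.vstack([-phi_xx(xc),                   # PDE residual rows
               phi(np.array([0.0, 1.0]))])    # boundary condition rows
rhs = np.concatenate([np.pi ** 2 * np.sin(np.pi * xc), [0.0, 0.0]])

c, *_ = np.linalg.lstsq(A, rhs, rcond=None)   # one linear solve, no training loop

xt = np.linspace(0.0, 1.0, 500)
err = np.max(np.abs(phi(xt) @ c - np.sin(np.pi * xt)))
print(f"max error: {err:.2e}")
```

Because only the output layer is trained, the whole "optimization" is one dense least-squares problem, which is where the reported speedups over iteratively trained PINNs come from.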

📝 Abstract
Approximation of solutions to partial differential equations (PDE) is an important problem in computational science and engineering. Using neural networks as an ansatz for the solution has proven a challenge in terms of training time and approximation accuracy. In this contribution, we discuss how sampling the hidden weights and biases of the ansatz network from data-agnostic and data-dependent probability distributions allows us to progress on both challenges. In most examples, the random sampling schemes outperform iterative, gradient-based optimization of physics-informed neural networks regarding training time and accuracy by several orders of magnitude. For time-dependent PDE, we construct neural basis functions only in the spatial domain and then solve the associated ordinary differential equation with classical methods from scientific computing over a long time horizon. This alleviates one of the greatest challenges for neural PDE solvers because it does not require us to parameterize the solution in time. For second-order elliptic PDE in Barron spaces, we prove the existence of sampled networks with $L^2$ convergence to the solution. We demonstrate our approach on several time-dependent and static PDEs. We also illustrate how sampled networks can effectively solve inverse problems in this setting. Benefits compared to common numerical schemes include spectral convergence and mesh-free construction of basis functions.
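The space-time separation described in the abstract can be sketched as a method-of-lines scheme: sampled basis functions discretize space only, and a classical stiff ODE integrator advances the coefficients in time. The widths, sampling ranges, and least-squares projection of the operator below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(1)

# Heat equation u_t = u_xx on (0, 1) with u(t, 0) = u(t, 1) = 0.
# Ansatz u(t, x) ≈ Phi(x) c(t) with sampled tanh basis functions Phi;
# the PDE reduces to a linear ODE for the coefficients c(t).
K = 40
w = rng.uniform(-5.0, 5.0, K)     # illustrative sampling ranges
b = rng.uniform(-5.0, 5.0, K)
xc = np.linspace(0.0, 1.0, 150)   # spatial collocation points

t_ = np.tanh(np.outer(xc, w) + b)
Phi = t_
Phi_xx = (w ** 2) * (-2.0 * t_ * (1.0 - t_ ** 2))
Bnd = np.tanh(np.outer([0.0, 1.0], w) + b)       # boundary rows

# Least-squares projection of the spatial operator: Phi c'(t) = Phi_xx c(t),
# with the boundary rows pinned to zero, gives c'(t) = M c(t).
A = np.vstack([Phi, Bnd])
M = np.linalg.lstsq(A, np.vstack([Phi_xx, np.zeros((2, K))]), rcond=1e-10)[0]

# Initial condition u(0, x) = sin(pi x); exact solution decays as exp(-pi^2 t).
c0 = np.linalg.lstsq(A, np.concatenate([np.sin(np.pi * xc), [0.0, 0.0]]),
                     rcond=1e-10)[0]

# Classical time integration over the whole horizon; an implicit method is
# used because the projected operator is stiff.
sol = solve_ivp(lambda t, c: M @ c, (0.0, 0.1), c0, method="BDF",
                rtol=1e-8, atol=1e-10)

u_end = Phi @ sol.y[:, -1]
exact = np.exp(-np.pi ** 2 * 0.1) * np.sin(np.pi * xc)
err = np.max(np.abs(u_end - exact))
print(f"max error at t = 0.1: {err:.2e}")
```

Since time never enters the network, the solver inherits the step-size control and long-horizon stability of classical ODE integrators, which is exactly the causality benefit the abstract highlights.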
Problem

Research questions and friction points this paper is trying to address.

Accelerating physics-informed neural network training without gradient descent
Improving accuracy for time-dependent partial differential equation solutions
Incorporating temporal causality in neural PDE solver architectures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses random features instead of gradient descent
Incorporates temporal causality by construction
Leverages space-time separation for PDE solving
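The data-aware sampling mode mentioned above can be sketched as follows: each hidden unit is built from a random pair of collocation points so that its activation transitions between them. The pairing rule and scale here are hypothetical illustrations; the paper's exact construction may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_data_aware(x, K, scale=4.0):
    """Draw K (weight, bias) pairs from random pairs of collocation points so
    that each tanh unit's transition region lies between the two points.
    (Hypothetical variant of the paper's data-aware mode.)"""
    i = rng.integers(0, len(x), K)
    j = rng.integers(0, len(x), K)
    j = np.where(i == j, (j + 1) % len(x), j)   # avoid degenerate pairs
    d = x[j] - x[i]
    w = scale / d                  # steeper units for closer pairs
    b = -w * (x[i] + x[j]) / 2.0   # center the transition between the pair
    return w, b

# Fit f(x) = sin(pi x) with the sampled basis via one least-squares solve.
x = np.linspace(0.0, 1.0, 200)
w, b = sample_data_aware(x, 100)
Phi = np.tanh(np.outer(x, w) + b)
c, *_ = np.linalg.lstsq(Phi, np.sin(np.pi * x), rcond=None)
err = np.max(np.abs(Phi @ c - np.sin(np.pi * x)))
print(f"max fit error: {err:.2e}")
```

The point of tying units to data pairs is that the basis concentrates its expressive power where collocation points actually lie, rather than wherever a data-agnostic distribution happens to place it.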