Learning the Exact Time Integration Algorithm for Initial Value Problems by Randomized Neural Networks

📅 2025-02-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a randomized neural network method that combines extreme learning machines (ELMs) with a physics-informed formulation to learn the exact time integration algorithm for initial value problems (IVPs). The exact algorithm for a non-autonomous system is represented by an algorithmic function in higher dimensions, which satisfies an associated system of partial differential equations; the ELM network learns this function by solving the associated system, and the trained network can then advance the solution from arbitrary initial conditions and with arbitrary step sizes within a chosen domain. When the system's right-hand side is periodic in any of its arguments, the algorithmic function is either periodic or satisfies a well-defined relation across periods, which substantially simplifies the learning. Numerical experiments on non-stiff, stiff, and chaotic systems show highly accurate long-time solutions, with time-marching errors decreasing nearly exponentially in the network's degrees of freedom, and computational performance (accuracy vs. cost) that matches or surpasses leading classical integrators, particularly in multi-initial-condition and variable-step-size scenarios.

📝 Abstract
We present a method leveraging extreme learning machine (ELM) type randomized neural networks (NNs) for learning the exact time integration algorithm for initial value problems (IVPs). The exact time integration algorithm for non-autonomous systems can be represented by an algorithmic function in higher dimensions, which satisfies an associated system of partial differential equations with corresponding boundary conditions. Our method learns the algorithmic function by solving this associated system using ELM with a physics-informed approach. The trained ELM network serves as the learned algorithm and can be used to solve the IVP with arbitrary initial data or step sizes from some domain. When the right-hand side of the non-autonomous system exhibits a periodicity with respect to any of its arguments, while the solution itself to the problem is not periodic, we show that the algorithmic function is either periodic, or when it is not, satisfies a well-defined relation for different periods. This property can greatly simplify the algorithm learning in many problems. We consider explicit and implicit NN formulations, leading to explicit or implicit time integration algorithms, and discuss how to train the ELM network by the nonlinear least squares method. Extensive numerical experiments with benchmark problems, including non-stiff, stiff and chaotic systems, show that the learned NN algorithm produces highly accurate solutions in long-time simulations, with its time-marching errors decreasing nearly exponentially with increasing degrees of freedom in the neural network. We compare extensively the computational performance (accuracy vs. cost) between the current NN algorithm and the leading traditional time integration algorithms. The learned NN algorithm is computationally competitive, markedly outperforming the traditional algorithms in many problems.
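To make the abstract's idea concrete, here is a minimal sketch (our own toy illustration, not the paper's implementation) of the ELM approach for the scalar test problem u' = -u. The "algorithmic function" Φ(u0, h) must satisfy the PDE ∂Φ/∂h = f(Φ) with boundary condition Φ(u0, 0) = u0; since f is linear here, the output weights can be found by an ordinary linear least squares solve (the paper uses nonlinear least squares in general). All names (`hidden`, `phi`, the domain bounds, the neuron count) are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 200                                    # hidden neurons
# Random, fixed hidden-layer parameters -- the ELM idea: only the
# output weights beta are trained.
W = rng.uniform(-2.0, 2.0, size=(2, M))    # weights for inputs (u0, h)
b = rng.uniform(-2.0, 2.0, size=M)

def hidden(u0, h):
    """Hidden-layer features for input points (u0, h)."""
    return np.tanh(np.outer(u0, W[0]) + np.outer(h, W[1]) + b)

def hidden_dh(u0, h):
    """Derivative of the features with respect to the step size h."""
    t = hidden(u0, h)
    return (1.0 - t**2) * W[1]

# Random collocation points in the (u0, h) training domain.
u0c = rng.uniform(-2.0, 2.0, 2000)
hc = rng.uniform(0.0, 0.5, 2000)

# PDE residual rows: dPhi/dh - f(Phi) = 0 with f(u) = -u,
# i.e. (dH/dh + H) @ beta = 0.
A_pde = hidden_dh(u0c, hc) + hidden(u0c, hc)

# Boundary condition Phi(u0, 0) = u0, i.e. H(u0, 0) @ beta = u0.
u0b = rng.uniform(-2.0, 2.0, 400)
A_bc = hidden(u0b, np.zeros_like(u0b))

A = np.vstack([A_pde, A_bc])
rhs = np.concatenate([np.zeros(len(u0c)), u0b])
beta, *_ = np.linalg.lstsq(A, rhs, rcond=None)

def phi(u0, h):
    """Learned one-step integrator: advance state u0 by step h."""
    return hidden(np.atleast_1d(u0), np.atleast_1d(h)) @ beta

# Time-march u' = -u, u(0) = 1 with the learned algorithm.
u = 1.0
for _ in range(10):
    u = phi(u, 0.1)[0]
print(u, np.exp(-1.0))
```

Once trained, the same network is reused for any initial value and any step size inside the training domain, which is the source of the efficiency gains the abstract reports for multi-initial-condition and variable-step-size runs.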
Problem

Research questions and friction points this paper is trying to address.

Exact time integration for initial value problems
Physics-informed randomized neural networks learning
Periodicity exploitation in non-autonomous systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses ELM type randomized neural networks
Learns exact time integration algorithm
Applies physics informed approach
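When the right-hand side is nonlinear, the output weights enter the PDE residual nonlinearly and the training becomes a nonlinear least squares problem, as the abstract notes. Below is a hedged sketch for the toy cubic ODE u' = -u^3 (exact flow u0 / sqrt(1 + 2 h u0^2)); the feature construction, domain bounds, and use of `scipy.optimize.least_squares` are our own illustrative choices, not the paper's code.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
M = 100
W = rng.uniform(-2.0, 2.0, size=(2, M))
b = rng.uniform(-2.0, 2.0, size=M)

def feats(u0, h):
    """Hidden features H and their derivative dH/dh at points (u0, h)."""
    t = np.tanh(np.outer(u0, W[0]) + np.outer(h, W[1]) + b)
    return t, (1.0 - t**2) * W[1]

# Collocation points for the PDE and for the boundary condition h = 0.
u0c = rng.uniform(0.0, 1.5, 800)
hc = rng.uniform(0.0, 0.3, 800)
Hc, dHc = feats(u0c, hc)
u0b = rng.uniform(0.0, 1.5, 200)
Hb, _ = feats(u0b, np.zeros_like(u0b))

def residual(beta):
    p = Hc @ beta
    r_pde = dHc @ beta + p**3      # dPhi/dh = -Phi^3
    r_bc = Hb @ beta - u0b         # Phi(u0, 0) = u0
    return np.concatenate([r_pde, r_bc])

def jacobian(beta):
    p = Hc @ beta
    return np.vstack([dHc + 3.0 * (p**2)[:, None] * Hc, Hb])

# Nonlinear least squares solve for the output weights.
beta = least_squares(residual, np.zeros(M), jac=jacobian).x

def phi(u0, h):
    H, _ = feats(np.atleast_1d(u0), np.atleast_1d(h))
    return H @ beta

exact = 1.0 / np.sqrt(1.0 + 2.0 * 0.2)   # flow of u' = -u^3 from u0 = 1
print(phi(1.0, 0.2)[0], exact)
```

The same residual-plus-boundary-condition structure carries over to the implicit NN formulations the paper discusses; only the form of the residual changes.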
Suchuan Dong
Department of Mathematics, Purdue University
computational fluid dynamics, numerical methods and algorithms, phase field, scientific computing, turbulence
Naxian Ni
Center for Computational and Applied Mathematics, Department of Mathematics, Purdue University, USA