🤖 AI Summary
This work addresses the challenge of learning the dynamics of non-Markovian stochastic systems with non-periodic inputs, such as path-dependent stochastic differential equations (SDEs) and Lipschitz transformations of fractional Brownian motion. We propose the Mirror-Padded Fourier Neural Operator (MFNO), a Fourier neural operator framework that incorporates mirror padding to suppress boundary artifacts and better handle non-periodic signals. Theoretically, leveraging a Wong–Zakai-type theorem and a multi-stage approximation analysis, we rigorously prove that MFNO can approximate the solution operators of both classes of non-Markovian processes to arbitrary accuracy. Empirically, MFNO significantly outperforms LSTM, TCN, and DeepONet baselines in resolution generalization, and it generates sample paths orders of magnitude faster than conventional numerical solvers while maintaining high accuracy.
📝 Abstract
This paper introduces an operator-based neural network, the mirror-padded Fourier neural operator (MFNO), designed to learn the dynamics of stochastic systems. MFNO extends the standard Fourier neural operator (FNO) with mirror padding, enabling it to handle non-periodic inputs. We rigorously prove that MFNOs can approximate solutions of path-dependent stochastic differential equations and Lipschitz transformations of fractional Brownian motions to an arbitrary degree of accuracy. Our theoretical analysis builds on Wong–Zakai-type theorems and various approximation techniques. Empirically, the MFNO exhibits strong resolution generalization, a property rarely seen in standard architectures such as LSTMs, TCNs, and DeepONet. Furthermore, our model achieves performance comparable to or better than these baselines while offering significantly faster sample path generation than classical numerical schemes.
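To illustrate the core idea behind mirror padding, the sketch below reflects a 1-D signal about its endpoint before a Fourier transform would be applied. This is a minimal NumPy illustration of the general technique, not the paper's actual MFNO layer; the function name `mirror_pad` and the specific padding scheme are assumptions for demonstration.

```python
import numpy as np

def mirror_pad(x):
    """Append the reversed signal so the padded sequence is
    even-symmetric, removing the endpoint discontinuity that
    makes non-periodic inputs problematic for the FFT.
    (Illustrative sketch only; MFNO's internals may differ.)"""
    return np.concatenate([x, x[::-1]])

# A non-periodic ramp: its raw periodic extension has a jump
# between the last and first samples; mirroring removes it.
x = np.linspace(0.0, 1.0, 64)
padded = mirror_pad(x)

print(padded.shape)            # (128,)
print(padded[63] == padded[64])  # seam is continuous: True
```

In an FNO-style layer one would apply the FFT to `padded`, filter in frequency space, invert, and then crop back to the original 64 samples, so the padding never changes the output length.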