🤖 AI Summary
This work addresses the challenge of efficiently learning state evolution and optimal control for distributed parameter systems governed by partial differential equations. Building upon the Ehrenpreis–Palamodov fundamental principle, it introduces a novel complex-frequency neural operator by extending the frequency variable in Fourier Neural Operators (FNOs) from the real to the complex domain for the first time. By incorporating an integral representation in the complex frequency domain, the proposed method overcomes the limitations of conventional FNOs, which are restricted to periodic boundary conditions and real frequencies, thereby enabling a unified framework for learning both system states and linear-quadratic optimal controls. Experiments on the nonlinear Burgers equation demonstrate that the approach reduces training error by an order of magnitude and substantially improves prediction accuracy under non-periodic boundary conditions.
📝 Abstract
We propose an extended Fourier neural operator (FNO) architecture for learning state and linear quadratic additive optimal control of systems governed by partial differential equations. Using the Ehrenpreis-Palamodov fundamental principle, we show that any state and optimal control of linear PDEs with constant coefficients can be represented as an integral in the complex domain. The integrand of this representation involves the same exponential term as in the inverse Fourier transform, where the latter is used to represent the convolution operator in FNO layer. Motivated by this observation, we modify the FNO layer by extending the frequency variable in the inverse Fourier transform from the real to complex domain to capture the integral representation from the fundamental principle. We illustrate the performance of FNO in learning state and optimal control for the nonlinear Burgers' equation, showing order of magnitude improvements in training errors and more accurate predictions of non-periodic boundary values over FNO.