Fourier PINNs: From Strong Boundary Conditions to Adaptive Fourier Bases

📅 2024-10-04
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Physics-informed neural networks (PINNs) struggle to accurately resolve high-frequency and multiscale solutions of partial differential equations (PDEs). Method: We propose Fourier-enhanced PINNs (FE-PINNs), which explicitly embed a dense, predefined Fourier basis into the network architecture and introduce an adaptive frequency selection mechanism based on alternating optimization—bypassing conventional hard boundary condition enforcement and enabling flexible handling of arbitrary boundaries and complex geometries. Our approach integrates Fourier analysis, the convolution theorem, and coefficient truncation. Contributions/Results: FE-PINNs significantly improve high-frequency component fidelity, achieving systematically lower relative errors than standard PINNs across benchmarks. Moreover, they more accurately reconstruct the power spectral structure of target solutions. This yields an interpretable, robust, mesh-free paradigm for multiscale physical modeling.

📝 Abstract
Interest is rising in Physics-Informed Neural Networks (PINNs) as a mesh-free alternative to traditional numerical solvers for partial differential equations (PDEs). However, PINNs often struggle to learn high-frequency and multi-scale target solutions. To tackle this problem, we first study a strong Boundary Condition (BC) version of PINNs for Dirichlet BCs and observe a consistent decline in relative error compared to standard PINNs. We then perform a theoretical analysis based on the Fourier transform and convolution theorem, and find that strong BC PINNs better learn the amplitudes of high-frequency components of the target solutions. However, constructing the architecture for strong BC PINNs is difficult for many BCs and domain geometries. Guided by our theoretical analysis, we propose Fourier PINNs -- a simple, general, yet powerful method that augments PINNs with pre-specified, dense Fourier bases. Our proposed architecture likewise learns high-frequency components better but places no restrictions on the particular BCs or problem domains. We develop an adaptive learning and basis-selection algorithm that alternates among neural-net basis optimization, estimation of the Fourier and neural-net basis coefficients, and coefficient truncation. This scheme flexibly identifies the significant frequencies while suppressing negligible ones, better capturing the target solution's power spectrum. We show the advantage of our approach through a set of systematic experiments.
Problem

Research questions and friction points this paper is trying to address.

Improve high-frequency solution learning in PINNs
Address strong boundary condition challenges
Develop adaptive Fourier basis selection algorithm
Innovation

Methods, ideas, or system contributions that make the work stand out.

Strong Boundary Condition PINNs
Adaptive Fourier Bases
Neural Net Basis Optimization
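The strong BC construction the paper analyzes for Dirichlet problems can be sketched in one dimension. Below is a minimal, hypothetical illustration (not the paper's code; the `network` body is a stand-in for a trained PINN): the ansatz u(x) = g(x) + d(x)·N(x), with g interpolating the boundary data and d vanishing on the boundary, satisfies the BCs exactly regardless of the network's output.

```python
# Hard (strong) Dirichlet BC enforcement in 1-D on [0, 1].
# Assumptions: boundary data u(0) = a, u(1) = b; `network` is a placeholder.
import numpy as np

a, b = 1.5, -0.5  # Dirichlet data: u(0) = a, u(1) = b

def network(x):
    # Stand-in for a PINN body: any smooth function works for this demo.
    return np.tanh(3.0 * x) * np.cos(5.0 * x)

def u_hat(x):
    g = (1.0 - x) * a + x * b  # linear lift matching the boundary data
    d = x * (1.0 - x)          # distance-like factor, zero at x = 0 and x = 1
    return g + d * network(x)

# The BCs hold exactly, independent of the network's parameters.
print(u_hat(0.0), u_hat(1.0))  # -> 1.5 -0.5
```

Constructing such a g and d is easy in 1-D but, as the paper notes, becomes difficult for general BCs and complex geometries, which motivates the Fourier-basis alternative.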