SFO: Learning PDE Operators via Spectral Filtering

📅 2026-01-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing neural operators struggle to efficiently model the long-range, nonlocal interactions inherent in the solution mappings of partial differential equations (PDEs). This work proposes the Spectral Filtering Operator (SFO), which leverages the insight that the discretized Green’s functions of translation-invariant PDEs exhibit a spatial linear dynamical system structure. Building upon this, SFO constructs a Universal Spectral Basis (USB) to compactly represent integral kernels. By learning only a small number of spectral coefficients corresponding to rapidly decaying eigenvalues, SFO achieves significantly improved approximation efficiency. Evaluated across six benchmarks spanning reaction–diffusion systems, fluid dynamics, and three-dimensional electromagnetics, SFO attains state-of-the-art accuracy—reducing errors by up to 40% compared to strong baselines—while substantially decreasing the number of parameters.

📝 Abstract
Partial differential equations (PDEs) govern complex systems, yet neural operators often struggle to efficiently capture the long-range, nonlocal interactions inherent in their solution maps. We introduce Spectral Filtering Operator (SFO), a neural operator that parameterizes integral kernels using the Universal Spectral Basis (USB), a fixed, global orthonormal basis derived from the eigenmodes of the Hilbert matrix in spectral filtering theory. Motivated by our theoretical finding that the discrete Green's functions of shift-invariant PDE discretizations exhibit spatial Linear Dynamical System (LDS) structure, we prove that these kernels admit compact approximations in the USB. By learning only the spectral coefficients of rapidly decaying eigenvalues, SFO achieves a highly efficient representation. Across six benchmarks, including reaction-diffusion, fluid dynamics, and 3D electromagnetics, SFO achieves state-of-the-art accuracy, reducing error by up to 40% relative to strong baselines while using substantially fewer parameters.
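The core idea in the abstract can be sketched numerically: a fixed basis built from the top eigenvectors of the Hilbert matrix compactly represents geometrically decaying kernels, so only a handful of coefficients need to be learned. The following is an illustrative sketch under our own naming and parameter choices, not the paper's implementation:

```python
import numpy as np

# Illustrative sketch (not the paper's code): in spectral filtering theory,
# the Hilbert matrix H[i, j] = 1 / (i + j + 1) (0-indexed) has rapidly
# decaying eigenvalues, and its top eigenvectors give a fixed orthonormal
# basis that compactly represents smooth, geometrically structured kernels
# such as impulse responses (1, a, a^2, ...) of a linear dynamical system.

def universal_spectral_basis(n: int, k: int):
    """Top-k eigenpairs of the n x n Hilbert matrix, sorted descending."""
    idx = np.arange(n)
    H = 1.0 / (idx[:, None] + idx[None, :] + 1.0)   # Hilbert matrix
    eigvals, eigvecs = np.linalg.eigh(H)            # ascending order
    return eigvals[::-1][:k], eigvecs[:, ::-1][:, :k]

n, k = 64, 8
vals, basis = universal_spectral_basis(n, k)

# A geometrically decaying "kernel" column, as produced by a stable 1-D
# spatial LDS with decay factor a; a = 0.9 is an arbitrary choice here.
a = 0.9
kernel = a ** np.arange(n)

coeffs = basis.T @ kernel      # only these k numbers would be learned
recon = basis @ coeffs
rel_err = np.linalg.norm(recon - kernel) / np.linalg.norm(kernel)
```

With k = 8 coefficients standing in for n = 64 kernel entries, the relative reconstruction error should already be small, mirroring the compact approximation property the abstract claims; in SFO, such spectral coefficients are the learned parameters.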
Problem

Research questions and friction points this paper is trying to address.

neural operators
partial differential equations
nonlocal interactions
long-range dependencies
solution maps
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectral Filtering Operator
Universal Spectral Basis
Neural Operators
Linear Dynamical System
Green's Functions
Noam Koren
Department of Computer Science, Technion - Israel Institute of Technology, Haifa, Israel
Rafael Moschopoulos
Department of Computer Science, Princeton University, Princeton, New Jersey, USA
Kira Radinsky
Technion
Elad Hazan
Professor at Princeton University and Director of Google AI Princeton
Machine Learning · Mathematical Optimization