Spectral bias in physics-informed and operator learning: Analysis and mitigation guidelines

📅 2026-02-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses spectral bias in physics-informed neural networks and neural operators for solving partial differential equations, whereby low-frequency components of the solution converge preferentially, hindering accurate modeling of high-frequency details. The study systematically demonstrates that this bias stems from optimization dynamics rather than merely from limitations in representational capacity. To mitigate it, the authors propose a mechanism combining second-order optimization with a spectrum-aware loss function. Using diagnostic tools such as frequency-decomposed error analysis, Barron norms, and higher-order statistical moments, they show that the method significantly improves early-stage recovery of high-frequency modes across benchmark problems, including the KdV equation, the wave equation, and turbulent flow reconstruction, without incurring additional inference overhead.
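The frequency-decomposed error analysis mentioned above can be sketched with a simple FFT-based diagnostic: split the prediction error into frequency bands and report a relative error per band. The band count, normalization, and test signal below are illustrative choices, not the paper's exact setup.

```python
import numpy as np

def frequency_resolved_error(u_pred, u_true, n_bands=4):
    """Relative L2 error of (u_pred - u_true) per frequency band,
    ordered from lowest to highest frequencies. A generic sketch of
    frequency-decomposed error analysis, not the paper's exact metric."""
    err_hat = np.fft.rfft(u_pred - u_true)
    true_hat = np.fft.rfft(u_true)
    edges = np.linspace(0, err_hat.size, n_bands + 1, dtype=int)
    errors = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        num = np.linalg.norm(err_hat[lo:hi])
        den = np.linalg.norm(true_hat[lo:hi]) + 1e-12  # guard empty bands
        errors.append(num / den)
    return errors

# Example: a prediction that captures the low mode but misses the high one.
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
u_true = np.sin(x) + 0.5 * np.sin(50 * x)
u_pred = np.sin(x)  # high-frequency mode entirely absent
band_errs = frequency_resolved_error(u_pred, u_true)
```

In this toy case the lowest band shows near-zero error while the band containing mode 50 shows a relative error of essentially 1, which is the qualitative signature of spectral bias during early training.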

📝 Abstract
Neural networks and Kolmogorov-Arnold networks (KANs) used to solve partial differential equations (PDEs), including physics-informed neural networks (PINNs), physics-informed KANs (PIKANs), and neural operators, are known to exhibit spectral bias, whereby low-frequency components of the solution are learned significantly faster than high-frequency modes. While spectral bias is often treated as an intrinsic representational limitation of neural architectures, its interaction with optimization dynamics and physics-based loss formulations remains poorly understood. In this work, we provide a systematic investigation of spectral bias in physics-informed and operator learning frameworks, with emphasis on the coupled roles of network architecture, activation functions, loss design, and optimization strategy. We quantify spectral bias through frequency-resolved error metrics, Barron-norm diagnostics, and higher-order statistical moments, enabling a unified analysis across elliptic, hyperbolic, and dispersive PDEs. Through diverse benchmark problems, including the Korteweg-de Vries, wave, and steady-state diffusion-reaction equations, turbulent flow reconstruction, and earthquake dynamics, we demonstrate that spectral bias is not simply representational but fundamentally dynamical. In particular, second-order optimization methods substantially alter the spectral learning order, enabling earlier and more accurate recovery of high-frequency modes for all PDE types. For neural operators, we further show that spectral bias depends on the neural operator architecture and can also be effectively mitigated through spectral-aware loss formulations without increasing the inference cost.
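As a rough illustration of a spectral-aware loss of the kind the abstract describes, one can weight residuals in Fourier space so that high-frequency mismatch is penalized more heavily. The power-law weight `(1 + k)**alpha` below is a hypothetical choice for the sketch, not the paper's formulation.

```python
import numpy as np

def spectrum_aware_loss(u_pred, u_true, alpha=1.0):
    """Fourier-weighted squared error: weight (1 + k)**alpha up-weights
    high-frequency residual modes. alpha=0 gives an unweighted spectral
    MSE (equivalent to ordinary MSE up to scaling, by Parseval)."""
    resid_hat = np.fft.rfft(u_pred - u_true)
    k = np.arange(resid_hat.size)
    weights = (1.0 + k) ** alpha
    return float(np.sum(weights * np.abs(resid_hat) ** 2) / u_true.size ** 2)

# Same-amplitude error placed at a low vs. a high frequency:
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u_true = np.sin(x)
loss_low = spectrum_aware_loss(u_true + 0.1 * np.sin(2 * x), u_true)
loss_high = spectrum_aware_loss(u_true + 0.1 * np.sin(40 * x), u_true)
```

An ordinary MSE assigns these two errors identical loss, so gradient descent feels no extra pressure to fix the high-frequency one; the weighted version penalizes the mode-40 error more, which is the mechanism a spectral-aware loss exploits.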
Problem

Research questions and friction points this paper is trying to address.

spectral bias
physics-informed learning
operator learning
partial differential equations
frequency learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

spectral bias
physics-informed learning
neural operators
second-order optimization
frequency-resolved analysis
Siavash Khodakarami
Division of Applied Mathematics, Brown University, Providence, RI 02912, USA
Vivek Oommen
Brown University
Scientific Machine Learning, Fluid Mechanics, Heat Transfer, Material Science
Nazanin Ahmadi Daryakenari
School of Engineering, Brown University, Providence, RI 02912, USA
Maxim Beekenkamp
Division of Applied Mathematics, Brown University, Providence, RI 02912, USA
George Em Karniadakis
The Charles Pitts Robinson and John Palmer Barstow Professor of Applied Mathematics and Engineering
Math+Machine Learning, Probabilistic Scientific Computing, Stochastic Multiscale Modeling