Physics-Informed Chebyshev Polynomial Neural Operator for Parametric Partial Differential Equations

📅 2026-02-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of existing MLP-based physics-informed neural operators in solving parametric partial differential equations (PDEs), which suffer from spectral bias and training instability. The authors propose a mesh-free Chebyshev Polynomial Neural Operator (CPNO), which introduces Chebyshev spectral bases into the neural operator framework for the first time. By replacing unstable monomial expansions with numerically stable orthogonal bases and incorporating a parameter-dependent modulation mechanism, CPNO efficiently constructs PDE solutions within a near-optimal function space. The method achieves near-minimax uniform approximation, significantly improving condition numbers and gradient flow stability. Experiments demonstrate that CPNO attains higher accuracy, faster convergence, and greater hyperparameter robustness across multiple parametric PDE benchmarks, and successfully handles complex geometries such as transonic airfoil flow fields.

📝 Abstract
Neural operators have emerged as powerful deep learning frameworks for approximating solution operators of parameterized partial differential equations (PDEs). However, current methods predominantly rely on multilayer perceptrons (MLPs) to map inputs to solutions, which impairs training robustness in physics-informed settings due to inherent spectral bias and fixed activation functions. To overcome these architectural limitations, we introduce the Physics-Informed Chebyshev Polynomial Neural Operator (CPNO), a novel mesh-free framework that applies a basis transformation to replace unstable monomial expansions with the numerically stable Chebyshev spectral basis. By integrating a parameter-dependent modulation mechanism into the main network, CPNO constructs PDE solutions in a near-optimal function space, decoupling the model from MLP-specific constraints and enhancing multi-scale representation. Theoretical analysis demonstrates the Chebyshev basis's near-minimax uniform approximation properties and superior conditioning, with Lebesgue constants growing only logarithmically in the degree, thereby mitigating spectral bias and ensuring stable gradient flow during optimization. Numerical experiments on benchmark parameterized PDEs show that CPNO achieves superior accuracy, faster convergence, and enhanced robustness to hyperparameters. An experiment on transonic airfoil flow demonstrates CPNO's capability to handle complex geometries.
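The abstract's central numerical claim, that the Chebyshev basis is far better conditioned than the monomial basis it replaces, can be checked directly. The sketch below (not code from the paper; function and parameter names are illustrative) compares the condition numbers of the design matrices built from the two bases at the same sample points:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def basis_condition_numbers(degree=20, n_points=64):
    """Compare conditioning of monomial vs. Chebyshev design matrices."""
    # Sample points in [-1, 1], the natural domain of Chebyshev polynomials.
    x = np.linspace(-1.0, 1.0, n_points)
    # Monomial (Vandermonde) matrix: columns 1, x, x^2, ..., x^degree.
    V_mono = np.vander(x, degree + 1, increasing=True)
    # Chebyshev matrix: columns T_0(x), T_1(x), ..., T_degree(x),
    # all bounded by 1 in magnitude on [-1, 1].
    V_cheb = C.chebvander(x, degree)
    return np.linalg.cond(V_mono), np.linalg.cond(V_cheb)

cond_mono, cond_cheb = basis_condition_numbers()
print(f"monomial condition number:  {cond_mono:.3e}")
print(f"chebyshev condition number: {cond_cheb:.3e}")
```

At degree 20 the monomial matrix is ill-conditioned by many orders of magnitude relative to the Chebyshev one, which is the stability gap the paper exploits when swapping the expansion basis inside the operator.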
Problem

Research questions and friction points this paper is trying to address.

neural operators
parametric PDEs
spectral bias
physics-informed learning
numerical stability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Chebyshev Polynomial Neural Operator
Physics-Informed Neural Operator
Spectral Basis
Parametric PDEs
Mesh-Free Learning