Sensitivity-Constrained Fourier Neural Operators for Forward and Inverse Problems in Parametric Differential Equations

📅 2025-05-13
🏛️ International Conference on Learning Representations
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses forward and inverse problems for parametric differential equations, identifying key limitations of Fourier Neural Operators (FNOs)—including physical inconsistency and poor generalization—in sensitivity estimation (∂u/∂p), parameter inversion, and concept drift scenarios. To overcome these bottlenecks, we propose a parameter-sensitivity-constrained regularization framework that explicitly incorporates ∂u/∂p into the FNO training objective for the first time, synergistically integrating physics-informed constraints with FNO’s spectral-domain modeling capability. Our method achieves high-accuracy solution-path prediction and parameter inversion in an 82-dimensional parameter space, significantly reducing data and training requirements. Computational overhead per epoch increases by only 30–130%, while maintaining consistent performance across diverse PDE types and operator architectures. This establishes a new paradigm for parametric PDE learning that is both interpretable and robust.

Technology Category

Application Category

📝 Abstract
Parametric differential equations of the form du/dt = f(u, x, t, p) are fundamental in science and engineering. While deep learning frameworks such as the Fourier Neural Operator (FNO) can efficiently approximate solutions, they struggle with inverse problems, sensitivity estimation (du/dp), and concept drift. We address these limitations by introducing a sensitivity-based regularization strategy, called Sensitivity-Constrained Fourier Neural Operators (SC-FNO). SC-FNO achieves high accuracy in predicting solution paths and consistently outperforms standard FNO and FNO with physics-informed regularization. It improves performance in parameter inversion tasks, scales to high-dimensional parameter spaces (tested with up to 82 parameters), and reduces both data and training requirements. These gains are achieved with a modest increase in training time (30% to 130% per epoch) and generalize across various types of differential equations and neural operators. Code and selected experiments are available at: https://github.com/AMBehroozi/SC_Neural_Operators
Problem

Research questions and friction points this paper is trying to address.

Improves accuracy in solving parametric differential equations
Enhances performance in parameter inversion tasks
Reduces data and training requirements for neural operators
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sensitivity-based regularization for Fourier Neural Operators
Improves accuracy in solution path prediction
Scales to high-dimensional parameter spaces
🔎 Similar Papers
No similar papers found.