🤖 AI Summary
Data-driven discovery of partial differential equations (PDEs) remains challenging due to structural ambiguity and sensitivity to noise and data scale.
Method: We propose an adjoint-based parametric modeling framework for PDE discovery. The PDE form is parameterized by a sparse candidate library of linear and nonlinear terms and their spatial derivatives, yielding a PDE-constrained optimization problem. For the first time, we systematically derive the adjoint equations for general parametric PDE families via variational calculus, enabling analytical gradient computation to machine precision.
Contribution/Results: Our method significantly outperforms sparse-regression approaches (e.g., PDE-FIND) in structural identification accuracy and noise robustness across diverse PDEs, including the Burgers, KdV, and reaction-diffusion equations, especially under high noise levels and large-scale data. Integrated forward and adjoint numerical solvers keep training efficient, and the analytical gradients substantially accelerate optimization convergence.
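The setup above can be sketched schematically; the notation below (candidate terms $\phi_k$, coefficients $\theta_k$, data $u^*$) is illustrative rather than taken verbatim from the paper:

```latex
% Parameterized PDE: time derivative = weighted sum over the candidate library
u_t = \sum_k \theta_k \, \phi_k(u, u_x, u_{xx}, \dots)
% PDE-constrained objective: misfit between the solution u(x,t;\theta) and data u^*
J(\theta) = \int_0^T\!\!\int_\Omega \big(u(x,t;\theta) - u^*(x,t)\big)^2 \, dx\,dt
% Stationarity of the Lagrangian
%   L = J + \int_0^T\!\!\int_\Omega \lambda \,\big(u_t - \textstyle\sum_k \theta_k \phi_k\big)\, dx\,dt
% with respect to u gives the adjoint (backward) equation, \lambda(x,T) = 0:
\lambda_t + \sum_k \theta_k \Big[
      \frac{\partial \phi_k}{\partial u}\,\lambda
    - \partial_x\Big(\frac{\partial \phi_k}{\partial u_x}\,\lambda\Big)
    + \partial_x^2\Big(\frac{\partial \phi_k}{\partial u_{xx}}\,\lambda\Big) - \cdots
  \Big] = 2\,(u - u^*)
% Analytical gradient with respect to each library coefficient:
\frac{\partial J}{\partial \theta_k}
  = -\int_0^T\!\!\int_\Omega \lambda \, \phi_k \, dx\,dt
```

One forward solve plus one backward adjoint solve thus yields the gradient with respect to every coefficient at once, which is what makes the method scale well with library size.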
📝 Abstract
In this work, we present an adjoint-based method for discovering the governing partial differential equations (PDEs) underlying given data. The idea is to consider a parameterized PDE in a general form and formulate a PDE-constrained optimization problem that minimizes the misfit between the PDE solution and the data. Using variational calculus, we obtain an evolution equation for the Lagrange multipliers (the adjoint equations), which allows us to compute the gradient of the objective function with respect to the PDE parameters in a straightforward manner. In particular, we consider a family of parameterized PDEs encompassing linear, nonlinear, and spatial-derivative candidate terms, and derive the corresponding adjoint equations in closed form. We show the efficacy of the proposed approach in identifying the form of the PDE up to machine accuracy, enabling the accurate discovery of PDEs from data. We also compare its performance with the well-known PDE Functional Identification of Nonlinear Dynamics method, PDE-FIND (Rudy et al., 2017), on both smooth and noisy data sets. Even though the proposed adjoint method relies on forward/backward solvers, it outperforms PDE-FIND for large data sets thanks to the analytic expressions for the gradients of the cost function with respect to each PDE parameter.
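The forward/adjoint loop described in the abstract can be sketched for a minimal two-term Burgers library ($\phi_1 = u\,u_x$, $\phi_2 = u_{xx}$). Everything here, including the function names, the explicit-Euler discretization, and the periodic grid, is an illustrative assumption, not the paper's implementation:

```python
import numpy as np

def dx1(f, dx):
    """Periodic central first derivative."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dx)

def dx2(f, dx):
    """Periodic central second derivative."""
    return (np.roll(f, -1) - 2.0 * f + np.roll(f, 1)) / dx**2

def solve_forward(theta, u0, dx, dt, nt):
    """Explicit-Euler solve of u_t = theta[0]*u*u_x + theta[1]*u_xx; returns all snapshots."""
    us = np.empty((nt + 1, u0.size))
    us[0] = u0
    for n in range(nt):
        u = us[n]
        us[n + 1] = u + dt * (theta[0] * u * dx1(u, dx) + theta[1] * dx2(u, dx))
    return us

def loss(us, data, dx, dt):
    """Space-time misfit J(theta) between solution snapshots and data."""
    return np.sum((us - data) ** 2) * dx * dt

def adjoint_gradient(theta, us, data, dx, dt):
    """Backward solve of the continuous adjoint
         lam_t - theta[0]*u*lam_x + theta[1]*lam_xx = 2*(u - data),  lam(T) = 0,
       accumulating dJ/dtheta_k = -sum(lam * phi_k) dx dt for phi_1 = u*u_x, phi_2 = u_xx."""
    nt = us.shape[0] - 1
    lam = np.zeros(us.shape[1])
    grad = np.zeros(2)
    for n in range(nt, 0, -1):
        u = us[n]
        # one explicit step backward in time: lam^{n-1} = lam^n - dt * lam_t
        lam = lam - dt * (theta[0] * u * dx1(lam, dx)
                          - theta[1] * dx2(lam, dx)
                          + 2.0 * (u - data[n]))
        up = us[n - 1]
        grad[0] -= np.sum(lam * up * dx1(up, dx)) * dx * dt
        grad[1] -= np.sum(lam * dx2(up, dx)) * dx * dt
    return grad

# Synthetic data from a "true" Burgers equation u_t = -u u_x + 0.1 u_xx
nx, nt = 64, 50
x = np.linspace(0.0, 2.0 * np.pi, nx, endpoint=False)
dx, dt = x[1] - x[0], 0.01
theta_true = np.array([-1.0, 0.1])
data = solve_forward(theta_true, np.sin(x), dx, dt, nt)

theta = np.array([-0.8, 0.15])  # misspecified library coefficients
us = solve_forward(theta, np.sin(x), dx, dt, nt)
J = loss(us, data, dx, dt)
grad = adjoint_gradient(theta, us, data, dx, dt)
```

One forward solve plus one backward solve produces the gradient with respect to both coefficients; in an actual discovery run this gradient would feed a descent or quasi-Newton update of `theta`, with small coefficients pruned to recover a sparse PDE form.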