🤖 AI Summary
This work addresses the challenge of discovering governing partial differential equations (PDEs) from sparse and noisy data, which is hindered by the instability of numerical differentiation and the limited expressiveness of predefined candidate function libraries. To overcome these issues, the authors propose Weak-PDE-Net, an end-to-end differentiable framework that jointly trains a weak-form PDE generator and a forward response learner, thereby avoiding explicit numerical differentiation while adaptively capturing system dynamics. The method incorporates differentiable neural architecture search to move beyond fixed library constraints, enabling open-form PDE discovery, and integrates Galilean invariance and symmetry-equivariant priors to enforce physical consistency. Experiments demonstrate that Weak-PDE-Net robustly and accurately recovers the underlying governing equations across multiple complex PDE benchmarks, even under highly sparse and strongly noisy observations.
📝 Abstract
Discovering governing Partial Differential Equations (PDEs) from sparse and noisy data is a challenging problem in data-driven scientific computing. Conventional sparse regression methods often suffer from two major limitations: (i) the instability of numerical differentiation under sparse and noisy data, and (ii) the restricted flexibility of a pre-defined candidate library. We propose Weak-PDE-Net, an end-to-end differentiable framework that can robustly identify open-form PDEs. Weak-PDE-Net consists of two interconnected modules: a forward response learner and a weak-form PDE generator. The learner embeds learnable Gaussian kernels within a lightweight MLP, serving as a surrogate model that adaptively captures system dynamics from sparse observations. Meanwhile, the generator integrates a symbolic network with an integral module to construct weak-form PDEs, avoiding explicit numerical differentiation and improving robustness to noise. To relax the constraints of the pre-defined library, we leverage a Differentiable Neural Architecture Search strategy during training to explore the functional space, which enables the efficient discovery of open-form PDEs. The capability of Weak-PDE-Net in multivariable system discovery is further enhanced by incorporating Galilean invariance constraints and symmetry-equivariance hypotheses to ensure physical consistency. Experiments on several challenging PDE benchmarks demonstrate that Weak-PDE-Net accurately recovers governing equations, even under highly sparse and noisy observations.
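The noise-robustness claim rests on a standard weak-form trick: integration by parts moves derivatives off the noisy data and onto a smooth, compactly supported test function, which can be differentiated analytically. The following NumPy sketch illustrates that mechanism in one dimension; it is an assumption-laden toy (the test function, grid, and noise level are invented for illustration), not the authors' implementation.

```python
import numpy as np

# Weak-form idea (illustrative sketch): integration by parts gives
#     int u_x * phi dx = -int u * phi_x dx   (boundary terms vanish),
# so the noisy field u is never differentiated numerically; only the
# smooth, analytically known test function phi is.

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 401)
dx = x[1] - x[0]
u = np.sin(np.pi * x) + 0.05 * rng.standard_normal(x.size)  # noisy samples
u_x_true = np.pi * np.cos(np.pi * x)                        # ground truth

# Polynomial bump test function: vanishes at the domain boundary,
# so the boundary term from integration by parts drops out.
p = 4
phi = (1.0 - x**2) ** p
phi_x = -2.0 * p * x * (1.0 - x**2) ** (p - 1)

def integrate(f):
    """Trapezoidal rule on the uniform grid."""
    return dx * (0.5 * f[0] + f[1:-1].sum() + 0.5 * f[-1])

weak = -integrate(u * phi_x)       # weak-form estimate of int u_x * phi dx
exact = integrate(u_x_true * phi)  # same functional from the true derivative

# Strong-form baseline: finite differences amplify the noise badly.
fd_err = np.max(np.abs(np.gradient(u, x) - u_x_true))

print(f"weak-form error:    {abs(weak - exact):.4f}")  # small
print(f"pointwise FD error: {fd_err:.2f}")             # large
```

The contrast is the point: the weak-form functional averages the noise out under the integral, while pointwise finite differencing divides the noise by the grid spacing and blows it up, which is limitation (i) cited in the abstract.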