Weak-PDE-Net: Discovering Open-Form PDEs via Differentiable Symbolic Networks and Weak Formulation

📅 2026-03-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of discovering governing partial differential equations (PDEs) from sparse and noisy data, which is hindered by numerical differentiation instability and limited expressiveness of predefined candidate function libraries. To overcome these issues, the authors propose Weak-PDE-Net, an end-to-end differentiable framework that jointly models a weak-form PDE generator and a forward response learner, thereby avoiding explicit numerical differentiation while adaptively capturing system dynamics. The method incorporates differentiable neural architecture search to transcend fixed library constraints, enabling open-ended PDE discovery, and integrates Galilean invariance and symmetry-equivariant priors to enforce physical consistency. Experiments demonstrate that Weak-PDE-Net robustly and accurately recovers the underlying governing equations across multiple complex PDE benchmarks, even under highly sparse and strongly noisy observational conditions.
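For intuition, the weak formulation the summary refers to rests on a standard integration-by-parts identity (a generic illustration, not reproduced from the paper): for a smooth test function $\phi$ compactly supported in the space–time domain $\Omega$,

```latex
\int_{\Omega} \partial_t u \,\phi \,\mathrm{d}\Omega
  = -\int_{\Omega} u \,\partial_t \phi \,\mathrm{d}\Omega,
\qquad
\int_{\Omega} \partial_{xx} u \,\phi \,\mathrm{d}\Omega
  = \int_{\Omega} u \,\partial_{xx}\phi \,\mathrm{d}\Omega .
```

All derivatives are transferred from the noisy observations $u$ onto the analytic test function $\phi$, which is why no numerical differentiation of the data is needed.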

📝 Abstract
Discovering governing Partial Differential Equations (PDEs) from sparse and noisy data is a challenging problem in data-driven scientific computing. Conventional sparse regression methods often suffer from two major limitations: (i) the instability of numerical differentiation under sparse and noisy data, and (ii) the restricted flexibility of a pre-defined candidate library. We propose Weak-PDE-Net, an end-to-end differentiable framework that can robustly identify open-form PDEs. Weak-PDE-Net consists of two interconnected modules: a forward response learner and a weak-form PDE generator. The learner embeds learnable Gaussian kernels within a lightweight MLP, serving as a surrogate model that adaptively captures system dynamics from sparse observations. Meanwhile, the generator integrates a symbolic network with an integral module to construct weak-form PDEs, avoiding explicit numerical differentiation and improving robustness to noise. To relax the constraints of the pre-defined library, we leverage a Differentiable Neural Architecture Search strategy during training to explore the functional space, enabling the efficient discovery of open-form PDEs. The capability of Weak-PDE-Net in multivariable system discovery is further enhanced by incorporating Galilean Invariance constraints and symmetry equivariance hypotheses to ensure physical consistency. Experiments on several challenging PDE benchmarks demonstrate that Weak-PDE-Net accurately recovers governing equations, even under highly sparse and noisy observations.
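The abstract's central claim is that projecting a candidate PDE onto compactly supported test functions removes the need to differentiate noisy data. A minimal sketch of that idea, written from scratch (not the paper's code), identifies the diffusion coefficient in the heat equation $u_t = \nu u_{xx}$ from samples of an exact solution: every derivative is placed on an analytic polynomial bump, and only values of $u$ are used. The window center, bump exponents, and grid sizes are arbitrary choices for illustration.

```python
import numpy as np

# Weak-form identification of nu in u_t = nu * u_xx (illustrative sketch).
# Data: exact solution u = exp(-nu t) sin(x), sampled on a grid.
nu_true = 0.1
x = np.linspace(0.0, np.pi, 201)
t = np.linspace(0.0, 1.0, 201)
X, T = np.meshgrid(x, t, indexing="ij")
u = np.exp(-nu_true * T) * np.sin(X)

# Polynomial bump test function phi = (1-a^2)^4 (1-b^2)^4 on a local window;
# its derivatives are known in closed form, so u is never differentiated.
xc, rx, tc, rt = np.pi / 2, 1.2, 0.5, 0.4
a, b = (X - xc) / rx, (T - tc) / rt
inside = (np.abs(a) < 1.0) & (np.abs(b) < 1.0)
phi_t = np.where(inside, -8.0 * b / rt * (1 - a**2) ** 4 * (1 - b**2) ** 3, 0.0)
phi_xx = np.where(
    inside, -8.0 / rx**2 * (1 - a**2) ** 2 * (1 - 7 * a**2) * (1 - b**2) ** 4, 0.0
)

# Weak form: 0 = ∫∫ (u_t - nu u_xx) phi = -∫∫ u phi_t - nu ∫∫ u phi_xx,
# since phi vanishes on the window boundary (integration by parts).
dx, dt = x[1] - x[0], t[1] - t[0]
I_t = (u * phi_t).sum() * dx * dt
I_xx = (u * phi_xx).sum() * dx * dt
nu_hat = -I_t / I_xx
print(f"recovered nu = {nu_hat:.4f}")  # close to 0.1
```

In the paper's setting, many such windows would yield a linear system over a whole library of candidate terms; this sketch solves the one-term case in closed form to show why the weak projection is robust to noise.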
Problem

Research questions and friction points this paper is trying to address.

Partial Differential Equations
sparse data
noisy data
symbolic discovery
governing equations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Weak formulation
Differentiable Symbolic Networks
Neural Architecture Search
Open-form PDE discovery
Physics-informed constraints
Xinxin Li
School of Mathematical Sciences, East China Normal University, Shanghai, 200241, China
Xingyu Cui
Institute of Applied Physics and Computational Mathematics, 100094, Beijing, China; Shanghai Zhangjiang Institute of Mathematics, 201203, Shanghai, China
Jin Qi
Institute of Applied Physics and Computational Mathematics, 100094, Beijing, China
Juan Zhang
Department of Mathematics, Xiangtan University
Matrix Computation · Numerical Algebra · Numerical Algorithm
Da Li
Academy for Advanced Interdisciplinary Studies, Northeast Normal University, 130024, Changchun, Jilin, China
Junping Yin
School of Mathematical Sciences, East China Normal University, Shanghai, 200241, China; Institute of Applied Physics and Computational Mathematics, 100094, Beijing, China; Shanghai Zhangjiang Institute of Mathematics, 201203, Shanghai, China