🤖 AI Summary
This work addresses the challenge of satisfying partial differential equations (PDEs) exactly by construction when solving initial-boundary value problems (IBVPs). We propose a solution framework grounded in Lie symmetry theory: explicit Lie symmetry analysis embeds the PDE's invariance structure directly into the model architecture, internalizing the physical laws as structural priors so that learned solutions automatically satisfy the governing equations. Combining symmetry reduction with PDE-constrained optimization, the framework learns symmetry-invariant solutions directly from initial and boundary data. On a range of linear homogeneous PDEs, the approach achieves higher accuracy, faster convergence, and more compact model representations than physics-informed neural networks (PINNs). Moreover, it enables rigorous a posteriori error estimation, enhancing both the reliability and the interpretability of predictions.
📝 Abstract
We introduce a method for efficiently solving initial-boundary value problems (IBVPs) that uses Lie symmetries to enforce the associated partial differential equation (PDE) exactly by construction. By leveraging symmetry transformations, the model inherently incorporates the physical laws and learns solutions from initial and boundary data. As a result, the loss directly measures the model's accuracy, leading to improved convergence. Moreover, for well-posed IBVPs, our method enables rigorous error estimation. The approach yields compact models, facilitating efficient optimization. We implement LieSolver and demonstrate its application to linear homogeneous PDEs with a range of initial conditions, showing that it is faster and more accurate than physics-informed neural networks (PINNs). Overall, our method improves both computational efficiency and the reliability of predictions for PDE-constrained problems.
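To make the core idea concrete, here is a minimal sketch (not the paper's LieSolver implementation) of what "satisfying the PDE exactly by construction" can mean for a linear homogeneous PDE: for the 1-D heat equation, any weighted superposition of exact kernel solutions satisfies the equation identically, so training only has to fit the initial/boundary data, and the data misfit is the sole source of error. The choice of heat equation, the shifted-kernel basis, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def heat_kernel(x, t, s, t0=0.05):
    # Exact solution of u_t = u_xx, centered at source location s;
    # the offset t0 > 0 regularizes the kernel at t = 0 (an assumption).
    return np.exp(-(x - s) ** 2 / (4.0 * (t + t0))) / np.sqrt(4.0 * np.pi * (t + t0))

sources = np.linspace(-2.0, 2.0, 25)   # kernel centers (illustrative)
xs = np.linspace(-1.0, 1.0, 200)

# Illustrative initial condition to fit at t = 0.
u0 = np.exp(-xs ** 2)

# Design matrix: each column is one exact PDE solution evaluated at t = 0.
A = np.stack([heat_kernel(xs, 0.0, s) for s in sources], axis=1)
w, *_ = np.linalg.lstsq(A, u0, rcond=None)

def model(x, t):
    # u(x, t) = sum_k w_k * kernel_k(x, t) satisfies u_t = u_xx exactly
    # for ANY weights w; only the data fit below can be wrong.
    return sum(wk * heat_kernel(x, t, s) for wk, s in zip(w, sources))

# The "loss" is purely a data misfit, not a PDE residual.
fit_err = np.max(np.abs(model(xs, 0.0) - u0))
```

Contrast this with a PINN, where the PDE residual itself is part of the loss and is only driven approximately to zero; here the residual vanishes identically for every weight vector, which is what makes the training loss a direct measure of solution accuracy.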