LNN-PINN: A Unified Physics-Only Training Framework with Liquid Residual Blocks

📅 2025-08-12
📈 Citations: 0 · Influential: 0
🤖 AI Summary
Physics-informed neural networks (PINNs) often suffer from limited prediction accuracy on complex problems. This work proposes a purely architecture-driven training framework that enhances hidden-layer mapping capacity by incorporating a Liquid Residual Gating (LRG) structure, without modifying sampling strategies, loss functions, or hyperparameters. LRG integrates a lightweight gating mechanism with residual connections, boosting model expressivity while preserving the original physics-based modeling and optimization pipeline. Evaluated on four canonical benchmark problems spanning diverse dimensions, boundary conditions, and differential operators, the method achieves significant reductions in RMSE and MAE, demonstrating robustness, generalization, and cross-problem adaptability. To the authors' knowledge, this is the first work to introduce liquid dynamic gating into PINN architecture design, establishing a new paradigm for performance improvement under purely physics-based constraints.
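The summary describes LRG only at a high level (a lightweight gate modulating a residual hidden-layer update); the paper's exact equations are not given here. A minimal NumPy sketch of one such gated residual block, under the assumption that the gate is a sigmoid of a linear map and the residual branch is a tanh layer (all weight names below are hypothetical):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
d = 8  # hidden width (illustrative only)

# Hypothetical parameters for one liquid residual gating block
W, b = 0.1 * rng.standard_normal((d, d)), np.zeros(d)
Wg, bg = 0.1 * rng.standard_normal((d, d)), np.zeros(d)

def lrg_block(h):
    """Assumed gated residual update: h + g * tanh(h W + b),
    where the gate g = sigmoid(h Wg + bg) modulates the residual branch,
    leaving the PINN loss and sampling pipeline untouched."""
    g = sigmoid(h @ Wg + bg)           # lightweight gate in [0, 1]
    return h + g * np.tanh(h @ W + b)  # residual connection

h = rng.standard_normal(d)
out = lrg_block(h)
```

Because the block only replaces the hidden-layer mapping, it can be stacked inside a standard PINN MLP while the physics residual loss and collocation sampling remain exactly as before, consistent with the paper's "architecture-only" claim.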

📝 Abstract
Physics-informed neural networks (PINNs) have attracted considerable attention for their ability to integrate partial differential equation priors into deep learning frameworks; however, they often exhibit limited predictive accuracy when applied to complex problems. To address this issue, we propose LNN-PINN, a physics-informed neural network framework that incorporates a liquid residual gating architecture while preserving the original physics modeling and optimization pipeline to improve predictive accuracy. The method introduces a lightweight gating mechanism solely within the hidden-layer mapping, keeping the sampling strategy, loss composition, and hyperparameter settings unchanged to ensure that improvements arise purely from architectural refinement. Across four benchmark problems, LNN-PINN consistently reduced RMSE and MAE under identical training conditions, with absolute error plots further confirming its accuracy gains. Moreover, the framework demonstrates strong adaptability and stability across varying dimensions, boundary conditions, and operator characteristics. In summary, LNN-PINN offers a concise and effective architectural enhancement for improving the predictive accuracy of physics-informed neural networks in complex scientific and engineering problems.
Problem

Research questions and friction points this paper is trying to address.

Improving predictive accuracy of physics-informed neural networks
Addressing limited performance in complex PDE problems
Enhancing adaptability across dimensions and boundary conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Liquid residual gating architecture
Lightweight hidden-layer gating mechanism
Unchanged physics modeling pipeline