Spectral Analysis of Hard-Constraint PINNs: The Spatial Modulation Mechanism of Boundary Functions

📅 2025-12-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work identifies a critical mechanism wherein boundary functions in hard-constraint physics-informed neural networks (HC-PINNs) significantly impede training dynamics via spatial multiplicative modulation. To address the resulting optimization stagnation, we establish, for the first time, a neural tangent kernel (NTK) theoretical framework for HC-PINNs, proving that the boundary function $B(\mathbf{x})$ acts as a low-pass filter on the NTK spectrum, inducing spectral collapse and deteriorating the condition number. Building on this insight, we propose the "effective rank" as a more robust convergence criterion and devise a spectrum-aware boundary-function construction strategy based on the trial function $\tilde{u} = A + B \cdot N$. Extensive evaluation on multidimensional PDE benchmarks demonstrates that our approach enables quantitative spectral control, eliminates optimization stagnation, and achieves stable, rapid convergence.
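To make the ansatz concrete, here is a minimal PyTorch sketch of a hard-constraint trial function for a 1D Dirichlet problem on $[0, 1]$. The network architecture and the choices $A(x) = u_0 + (u_1 - u_0)x$ and $B(x) = x(1 - x)$ are illustrative assumptions, not the spectrum-aware construction the paper proposes.

```python
import torch
import torch.nn as nn

class HardConstraintPINN(nn.Module):
    """Trial function u_tilde(x) = A(x) + B(x) * N(x; theta).

    A lifts the Dirichlet data exactly; B vanishes on the boundary,
    so u_tilde satisfies the boundary conditions for any network
    output N. (Illustrative 1D choices of A and B; the paper argues
    B should be designed with its spectral effect in mind.)
    """

    def __init__(self, u0: float, u1: float, width: int = 64):
        super().__init__()
        self.u0, self.u1 = u0, u1
        self.net = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        A = self.u0 + (self.u1 - self.u0) * x   # matches u(0)=u0, u(1)=u1
        B = x * (1.0 - x)                        # zero at x = 0 and x = 1
        return A + B * self.net(x)               # exact BCs by construction


model = HardConstraintPINN(u0=0.0, u1=1.0)
x = torch.tensor([[0.0], [0.5], [1.0]])
print(model(x))  # endpoints return exactly 0.0 and 1.0
```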

📝 Abstract
Physics-Informed Neural Networks with hard constraints (HC-PINNs) are increasingly favored for their ability to strictly enforce boundary conditions via a trial function ansatz $\tilde{u} = A + B \cdot N$, yet the theoretical mechanisms governing their training dynamics have remained unexplored. Unlike soft-constrained formulations, where boundary terms act as additive penalties, this work reveals that the boundary function $B$ introduces a multiplicative spatial modulation that fundamentally alters the learning landscape. A rigorous Neural Tangent Kernel (NTK) framework for HC-PINNs is established, deriving the explicit kernel composition law. This relationship demonstrates that the boundary function $B(\vec{x})$ acts as a spectral filter, reshaping the eigenspectrum of the neural network's native kernel. Through spectral analysis, the effective rank of the residual kernel is identified as a deterministic predictor of training convergence, superior to classical condition numbers. It is shown that widely used boundary functions can inadvertently induce spectral collapse, leading to optimization stagnation despite exact boundary satisfaction. Validated across multi-dimensional benchmarks, this framework transforms the design of boundary functions from a heuristic choice into a principled spectral optimization problem, providing a solid theoretical foundation for geometric hard constraints in scientific machine learning.
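The abstract refers to an explicit kernel composition law without stating it. For the plain function-space kernel, its form follows in one line from the ansatz, since $A$ and $B$ are fixed and only $N(\cdot;\theta)$ carries parameters; this is a reader's derivation, not a quotation from the paper:

```latex
% One-line derivation, assuming A and B are fixed (non-trainable):
\[
  \partial_\theta \tilde{u}(x) = B(x)\,\partial_\theta N(x;\theta)
  \quad\Longrightarrow\quad
  \Theta_{\tilde{u}}(x,x')
    = \bigl\langle \partial_\theta \tilde{u}(x),\,
                   \partial_\theta \tilde{u}(x') \bigr\rangle
    = B(x)\,B(x')\,\Theta_N(x,x').
\]
```

On a set of collocation points this is the congruence $D\,\Theta_N\,D$ with $D = \operatorname{diag}(B(x_i))$, which rescales and reorders the eigenspectrum, matching the "spectral filter" description. For the PDE-residual kernel the differential operator also acts on $B$, so additional derivative terms appear; the paper's full composition law presumably accounts for these.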
Problem

Research questions and friction points this paper is trying to address.

Analyzes how boundary functions modulate training dynamics in hard-constraint PINNs
Establishes a Neural Tangent Kernel framework to reveal spectral filtering effects
Transforms boundary function design into a principled spectral optimization problem
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hard-constraint PINNs enforce boundary conditions via multiplicative spatial modulation
Boundary function acts as a spectral filter reshaping the kernel eigenspectrum (see the numerical sketch after this list)
Framework transforms boundary design into a principled spectral optimization problem
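The spectral-filtering and effective-rank claims can be illustrated numerically. A common definition of effective rank (Roy & Vetterli, 2007) is the exponential of the spectral entropy of the normalized eigenvalues; the paper's exact definition may differ. The sketch below applies the boundary modulation $K \mapsto D K D$ with $D = \operatorname{diag}(B(x_i))$ to a toy stand-in kernel and reports how the effective rank changes; the kernel and the choice $B(x) = x(1 - x)$ are illustrative assumptions.

```python
import numpy as np

def effective_rank(K: np.ndarray, eps: float = 1e-12) -> float:
    """exp(spectral entropy) of a PSD kernel matrix (Roy & Vetterli, 2007)."""
    lam = np.clip(np.linalg.eigvalsh(K), 0.0, None)
    p = lam / (lam.sum() + eps)        # normalized eigenvalue distribution
    p = p[p > eps]
    return float(np.exp(-np.sum(p * np.log(p))))

# Toy stand-in for a network's native kernel: an RBF Gram matrix on [0, 1].
x = np.linspace(0.0, 1.0, 200)
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.1 ** 2))

# Boundary modulation K -> D K D with D = diag(B(x_i)), B(x) = x(1 - x).
D = np.diag(x * (1.0 - x))
K_mod = D @ K @ D

print(f"effective rank, native kernel:    {effective_rank(K):.2f}")
print(f"effective rank, modulated kernel: {effective_rank(K_mod):.2f}")
```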
Yuchen Xie
Sino–French Institute of Nuclear Engineering and Technology, Sun Yat-sen University, Zhuhai, 519082, PR China
Honghang Chi
Sino–French Institute of Nuclear Engineering and Technology, Sun Yat-sen University, Zhuhai, 519082, PR China
Haopeng Quan
Sino–French Institute of Nuclear Engineering and Technology, Sun Yat-sen University, Zhuhai, 519082, PR China
Yahui Wang
Sino–French Institute of Nuclear Engineering and Technology, Sun Yat-sen University, Zhuhai, 519082, PR China; CNPRI-SYSU Joint Research Center of Coolant Chemistry for Nuclear Reactor, Zhuhai, 519082, PR China
Wei Wang
Sino–French Institute of Nuclear Engineering and Technology, Sun Yat-sen University, Zhuhai, 519082, PR China
Yu Ma
Indiana University, Computer Science