Neural Tangent Kernel of Neural Networks with Loss Informed by Differential Operators

📅 2025-03-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work develops Neural Tangent Kernel (NTK) theory for deep neural networks trained under physics-informed losses involving differential operators, covering initialization, training dynamics, convergence, and the explicit structure of the kernel. Method: the paper establishes the first analytical framework for the NTK under differential-operator-driven losses, combining NTK theory, spectral analysis, and Physics-Informed Neural Network (PINN) modeling to rigorously derive a closed-form expression for the NTK and its spectral properties. Contribution/Results: the paper proves that physics-informed losses do not universally accelerate eigenvalue decay or exacerbate spectral bias; instead, convergence is jointly governed by how the differential operators are embedded and by the structure of the loss. Experiments validate the predicted spectral decay rates and bias patterns. The work provides the first systematic NTK-level theoretical explanation of generalization and optimization dynamics in PINNs, extending NTK analysis beyond conventional supervised learning settings.

📝 Abstract
Spectral bias is a significant phenomenon in neural network training and can be explained by neural tangent kernel (NTK) theory. In this work, we develop the NTK theory for deep neural networks with physics-informed loss, providing insights into the convergence of the NTK during initialization and training and revealing its explicit structure. We find that, in most cases, the differential operators in the loss function do not induce a faster eigenvalue decay rate or stronger spectral bias. Experimental results are presented to verify the theory.
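To make the abstract's central comparison concrete, here is a minimal numpy sketch (not the paper's construction) of the empirical NTK for a one-hidden-layer tanh network. All specifics — the width, the sampling grid, and the choice of a first-order operator d/dx as the "physics-informed" term — are illustrative assumptions. The sketch builds the parameter Jacobian of the network output f(x) and of its derivative f'(x), forms the Gram matrices K = JJ^T for each, and prints the normalized eigenvalue spectra so the two decay rates can be compared directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: f(x) = sum_k a_k * tanh(w_k * x + b_k),
# with 1/sqrt(m) output scaling as in NTK parameterization.
m = 512                               # hidden width (illustrative)
w = rng.normal(size=m)                # input weights
b = rng.normal(size=m)                # biases
a = rng.normal(size=m) / np.sqrt(m)   # output weights

x = np.linspace(-1.0, 1.0, 64)        # sample points

# Pre-activations and tanh derivatives at each (x_i, neuron_k) pair.
z = np.outer(x, w) + b                # shape (n, m)
t = np.tanh(z)
t1 = 1.0 - t**2                       # tanh'
t2 = -2.0 * t * t1                    # tanh''

# Jacobian of f(x_i) w.r.t. parameters (a, w, b), shape (n, 3m).
J_f = np.concatenate([t, a * t1 * x[:, None], a * t1], axis=1)

# Jacobian of the operator output f'(x_i) = sum_k a_k * tanh'(z) * w_k,
# i.e. the quantity a first-order physics-informed loss would fit.
J_df = np.concatenate(
    [t1 * w, a * (t2 * x[:, None] * w + t1), a * t2 * w], axis=1
)

# Empirical NTKs: K for the plain loss, K_op for the operator loss.
K = J_f @ J_f.T
K_op = J_df @ J_df.T

eig = np.sort(np.linalg.eigvalsh(K))[::-1]
eig_op = np.sort(np.linalg.eigvalsh(K_op))[::-1]

# Normalized spectra let the two decay rates be compared directly.
print("plain-loss NTK spectrum (top 5):", eig[:5] / eig[0])
print("operator-loss NTK spectrum (top 5):", eig_op[:5] / eig_op[0])
```

Plotting both normalized spectra on a log scale is the quickest way to eyeball the paper's claim that the operator does not necessarily produce faster eigenvalue decay.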
Problem

Research questions and friction points this paper is trying to address.

Develop NTK theory for neural networks trained with physics-informed losses.
Characterize the convergence and explicit structure of the NTK during initialization and training.
Determine whether differential operators in the loss strengthen spectral bias.
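The link between NTK spectra and spectral bias, which the questions above revolve around, can be sketched in a few lines. Under standard NTK theory, gradient-flow training linearizes and the residual along the kernel's i-th eigenvector decays as exp(-eta * lambda_i * t). The eigenvalues below are assumed values for a hypothetical 3-mode kernel, chosen only to show why small-eigenvalue (typically high-frequency) modes train slowly:

```python
import numpy as np

# Illustrative NTK training dynamics (not the paper's derivation):
# residual along eigenmode i decays as exp(-eta * lambda_i * t).
eta = 0.1
lambdas = np.array([10.0, 1.0, 0.01])  # assumed NTK eigenvalues
r0 = np.ones(3)                        # initial residual per eigenmode

for t in [0, 10, 100]:
    r_t = np.exp(-eta * lambdas * t) * r0
    print(f"t={t:>3}: residual per mode = {np.round(r_t, 4)}")
```

By t = 100 the large-eigenvalue mode has essentially converged while the smallest-eigenvalue mode has barely moved — exactly the bias pattern whose dependence on differential operators the paper analyzes.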
Innovation

Methods, ideas, or system contributions that make the work stand out.

First analytical framework for the NTK under differential-operator-driven losses
Closed-form expression and spectral properties of the NTK for PINN-style training
Proof that physics-informed losses do not universally accelerate eigenvalue decay or strengthen spectral bias