Neural Tangent Kernels and Fisher Information Matrices for Simple ReLU Networks with Random Hidden Weights

📅 2025-07-24
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This paper investigates the intrinsic relationship between the Neural Tangent Kernel (NTK) and the Fisher Information Matrix (FIM) for two-layer ReLU networks with random hidden-layer weights. Using random matrix theory and spectral analysis, we establish, for the first time, an explicit linear transformation between the NTK and FIM in the infinite-width limit. We further derive the exact spectral decomposition of the NTK, obtaining closed-form expressions for its principal eigenvalues and corresponding eigenfunctions, and reveal its low-rank structure. Leveraging this spectral characterization, we develop an efficient approximation formula for functions representable by the network. Our core contributions are threefold: (1) clarifying the fundamental equivalence between the NTK and FIM; (2) providing an analytically tractable spectral theory for the NTK; and (3) achieving a tight functional-space approximation that precisely characterizes the network's expressive power.
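The NTK–FIM relationship summarized above can be illustrated numerically. The sketch below (an assumption about the setup, not the paper's exact construction: it trains only the output weights of a two-layer ReLU network with fixed random hidden weights, and takes the Fisher matrix for unit-variance Gaussian regression noise) shows the familiar duality: the empirical NTK Gram matrix J Jᵀ and the empirical FIM (1/n) Jᵀ J share the same nonzero spectrum, since both are built from the same parameter Jacobian J.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, n = 3, 50, 20                # input dim, hidden width, sample count

W = rng.standard_normal((m, d))    # random hidden weights (held fixed)
X = rng.standard_normal((n, d))    # sample inputs

# For f(x) = a . relu(W x) with only the output weights a trainable,
# the Jacobian entry is df/da_j = relu(w_j . x).
J = np.maximum(X @ W.T, 0.0)       # shape (n, m)

ntk = J @ J.T                      # empirical NTK Gram matrix, (n, n)
fim = (J.T @ J) / n                # empirical Fisher matrix, (m, m)

# J J^T and J^T J share their nonzero eigenvalues, so the (rescaled)
# top-n FIM eigenvalues coincide with the NTK Gram spectrum.
ev_ntk = np.sort(np.linalg.eigvalsh(ntk))[::-1]
ev_fim = np.sort(np.linalg.eigvalsh(fim))[::-1][:n] * n
print(np.allclose(ev_ntk, ev_fim))   # prints True
```

This only exercises the finite-width, output-layer case; the paper's result concerns the full transformation between the two objects in the infinite-width limit.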

πŸ“ Abstract
We study Fisher information matrices and neural tangent kernels (NTK) for 2-layer ReLU networks with random hidden weights. We discuss the relation between the two notions as a linear transformation, and give a spectral decomposition of the NTK with concrete forms of the eigenfunctions associated with its major eigenvalues. We also obtain an approximation formula for the functions represented by the 2-layer neural networks.
Problem

Research questions and friction points this paper is trying to address.

Analyzing Fisher information matrices and the NTK for 2-layer ReLU networks
Exploring the spectral decomposition of the NTK and its eigenfunctions
Deriving an approximation formula for functions represented by 2-layer networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzes Fisher information matrices and the NTK for ReLU networks
Links the NTK and the FIM via an explicit linear transformation
Derives the spectral decomposition of the NTK with closed-form eigenvalues
Authors

Jun'ichi Takeuchi
Faculty of Information Science and Electrical Engineering, Kyushu University, Motooka 744, Nishi-ku, Fukuoka, Fukuoka, 819-0395, Japan

Yoshinari Takeishi
Kyushu University
Information theory · Machine learning theory

Noboru Murata
Waseda University
Statistical learning · Machine learning

Kazushi Mimura
Hiroshima City University

Ka Long Keith Ho
Joint Graduate School of Math for Innovation, Kyushu University, Motooka 744, Nishi-ku, Fukuoka, Fukuoka, 819-0395, Japan

Hiroshi Nagaoka
Department of Computer and Network Engineering, The University of Electro-Communications, 1-5-1 Chofugaoka, Chofu-city, Tokyo, 182-8585, Japan