Bayesian RG Flow in Neural Network Field Theories

📅 2024-05-27
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
This work addresses the long-standing challenge of establishing a unified theoretical framework bridging neural networks (NNs) and statistical field theory (SFT), specifically by formalizing architectural mappings and characterizing training dynamics in a field-theoretic language. We propose the Bayesian Renormalization Group–Neural Network Field Theory (BRG-NNFT) paradigm, the first integration of BRG methods into NN field theory, enabling a rigorous correspondence between training dynamics and scale-dependent field flows in information space. We construct a bidirectional coarse-graining flow in information space and prove that BRG reduces to the exact renormalization group (ERG) when the information cutoff coincides with a momentum cutoff. We analytically derive the BRG flow for arbitrary-depth, infinite-width networks and show that single-layer cos-net activations reproduce the ERG flow of a free scalar field. Numerical experiments confirm that information-shell BRG enables efficient post-training parameter renormalization.
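A rough sense of the post-training renormalization step can be given in code. The sketch below is a hypothetical illustration, not the paper's implementation: it estimates an empirical Fisher information matrix for a toy cos-activated network by finite differences, diagonalizes it, and projects out the parameter directions whose eigenvalues fall below an assumed information cutoff. The function names (`fisher_matrix`, `brg_step`), the toy `model`, and all numerical choices are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, x):
    """Toy single-layer cos-net: f(x) = sum_i a_i * cos(w_i * x + b_i)."""
    w, b, a = np.split(theta, 3)            # weights, biases, readout (equal widths)
    return np.cos(np.outer(x, w) + b) @ a   # shape (len(x),)

def fisher_matrix(theta, x, eps=1e-5):
    """Empirical Fisher metric for a unit-noise Gaussian likelihood:
    I_ij ~ E_x[ df/dtheta_i * df/dtheta_j ], Jacobian via central differences."""
    n, p = len(x), len(theta)
    J = np.zeros((n, p))
    for i in range(p):
        d = np.zeros(p)
        d[i] = eps
        J[:, i] = (model(theta + d, x) - model(theta - d, x)) / (2.0 * eps)
    return J.T @ J / n

def brg_step(theta, x, info_cutoff):
    """One information-shell coarse-graining step: keep only the parameter
    directions whose Fisher eigenvalue exceeds the cutoff (discarded
    directions are set to zero here; resetting to a prior mean would also do)."""
    I = fisher_matrix(theta, x)
    evals, evecs = np.linalg.eigh(I)
    keep = evals >= info_cutoff
    projector = evecs[:, keep] @ evecs[:, keep].T
    return projector @ theta, evals

x = np.linspace(-1.0, 1.0, 64)
theta = rng.normal(size=15)                  # width 5: (w, b, a) concatenated
theta_coarse, spectrum = brg_step(theta, x, info_cutoff=1e-3)
print(np.sort(spectrum)[::-1])               # Fisher spectrum, largest first
```

Lowering `info_cutoff` retains more parameter directions; raising it pushes the model further toward the information-theoretic 'IR', which is the direction of the post-training flow described above.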

📝 Abstract
The Neural Network Field Theory correspondence (NNFT) is a mapping from neural network (NN) architectures into the space of statistical field theories (SFTs). The Bayesian renormalization group (BRG) is an information-theoretic coarse graining scheme that generalizes the principles of the exact renormalization group (ERG) to arbitrarily parameterized probability distributions, including those of NNs. In BRG, coarse graining is performed in parameter space with respect to an information-theoretic distinguishability scale set by the Fisher information metric. In this paper, we unify NNFT and BRG to form a powerful new framework for exploring the space of NNs and SFTs, which we coin BRG-NNFT. With BRG-NNFT, NN training dynamics can be interpreted as inducing a flow in the space of SFTs from the information-theoretic 'IR' $\rightarrow$ 'UV'. Conversely, applying an information-shell coarse graining to the trained network's parameters induces a flow in the space of SFTs from the information-theoretic 'UV' $\rightarrow$ 'IR'. When the information-theoretic cutoff scale coincides with a standard momentum scale, BRG is equivalent to ERG. We demonstrate the BRG-NNFT correspondence on two analytically tractable examples. First, we construct BRG flows for trained, infinite-width NNs, of arbitrary depth, with generic activation functions. As a special case, we then restrict to architectures with a single infinitely-wide layer, scalar outputs, and generalized cos-net activations. In this case, we show that BRG coarse-graining corresponds exactly to the momentum-shell ERG flow of a free scalar SFT. Our analytic results are corroborated by a numerical experiment in which an ensemble of asymptotically wide NNs are trained and subsequently renormalized using an information-shell BRG scheme.
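As a schematic of the central quantity the abstract invokes, the Fisher information metric on parameter space can be written as below; the conventions and the form of the cutoff are generic illustrations, not necessarily the paper's exact definitions:

$$
\mathcal{I}_{ij}(\theta) \;=\; \mathbb{E}_{p(x,y\,|\,\theta)}\!\left[\,\partial_{\theta_i}\log p(y\,|\,x,\theta)\;\partial_{\theta_j}\log p(y\,|\,x,\theta)\,\right].
$$

Its eigenvalues $\lambda_a$ set the information-theoretic distinguishability of nearby models along each parameter direction. An information-shell BRG step integrates out the directions with $\lambda_a$ below a cutoff $\Lambda_{\text{info}}$, driving the 'UV' $\rightarrow$ 'IR' flow described above; when $\Lambda_{\text{info}}$ tracks a standard momentum cutoff, this reduces to the momentum-shell ERG.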
Problem

Research questions and friction points this paper is trying to address.

How can the Neural Network Field Theory correspondence and the Bayesian Renormalization Group be combined into a single framework?
Can NN training dynamics be interpreted as a renormalization-group flow in the space of statistical field theories?
How can the resulting BRG-NNFT correspondence be demonstrated in analytically tractable and numerical examples?
Innovation

Methods, ideas, or system contributions that make the work stand out.

NNFT maps ensembles of neural networks to statistical field theories (see the sketch after this list).
BRG generalizes renormalization to arbitrarily parameterized distributions via the Fisher information metric.
BRG-NNFT unifies NN training dynamics and field-theoretic RG flows.
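As a concrete, hedged illustration of the first point, the snippet below samples an ensemble of wide single-layer networks with cosine activations and estimates their two-point function; the priors, width, and normalization are illustrative choices and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_cos_net(width, x):
    """Draw one network f(x) = (1/sqrt(width)) * sum_i a_i * cos(w_i x + b_i)."""
    w = rng.normal(size=width)
    b = rng.uniform(0.0, 2.0 * np.pi, size=width)
    a = rng.normal(size=width)
    return np.cos(np.outer(x, w) + b) @ a / np.sqrt(width)

x = np.linspace(-2.0, 2.0, 41)
ensemble = np.stack([sample_cos_net(width=512, x=x) for _ in range(2000)])

# Empirical two-point function G(x, x') = E[f(x) f(x')] over the ensemble.
G = ensemble.T @ ensemble / len(ensemble)
print(G.shape, G[20, 20])
```

In the infinite-width limit this empirical kernel converges to the NNGP kernel, i.e. the two-point correlator of the corresponding free field theory; the paper's generalized cos-net construction is designed so that this correspondence lands on a free scalar SFT.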
🔎 Similar Papers
2024-05-09 · Machine Learning: Science and Technology · Citations: 3
Authors
Jessica N. Howard, Kavli Institute for Theoretical Physics, Santa Barbara, CA, USA
Marc S. Klinger, Department of Physics, University of Illinois, Urbana, IL 61801, USA
Anindita Maiti, Perimeter Institute for Theoretical Physics
A. G. Stapleton, Centre for Theoretical Physics, Queen Mary University of London, Mile End Road, London E1 4NS, U.K.

Topics: Artificial Intelligence · Deep Learning · Theoretical Physics · Field Theory