Revisiting Deep Information Propagation: Fractal Frontier and Finite-size Effects

📅 2025-08-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates information propagation dynamics in finite-width, randomly initialized neural networks, aiming to characterize the geometric nature of the order-to-chaos transition boundary. Using tools from random matrix theory and Fourier-structured transformations, we establish, for the first time, that this phase boundary exhibits a universal fractal structure, independent of input distribution and training dynamics. This finding moves beyond the infinite-width assumption inherent in classical mean-field theory, demonstrating that fractal criticality is an intrinsic property of finite-size networks. We extend the analysis to convolutional neural networks (CNNs), confirming identical propagation laws and thereby unifying the characterization of information propagation across fully connected and convolutional architectures. Moreover, we elucidate how network depth fundamentally governs the trade-off between representation separability and adversarial robustness. Collectively, our work establishes a new paradigm for understanding the essential complexity of finite-width neural networks.
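As background for the order-to-chaos boundary discussed above, the classical infinite-width mean-field criterion can be computed numerically. The sketch below is the standard mean-field recipe (illustrative function names; this is not the paper's finite-width analysis): it finds the fixed point q* of the length map for a tanh network with weight variance σ_w²/N and bias variance σ_b², then evaluates the slope χ of the correlation map at c = 1. χ < 1 marks the ordered phase, χ > 1 the chaotic phase, and χ = 1 is the transition the paper revisits at finite width.

```python
import numpy as np

# Gauss-Hermite quadrature for Gaussian expectations E_{z~N(0,1)}[f(z)].
_nodes, _weights = np.polynomial.hermite.hermgauss(64)

def gauss_mean(f):
    """E[f(z)] for standard normal z, via Gauss-Hermite quadrature."""
    return float(np.sum(_weights * f(np.sqrt(2.0) * _nodes)) / np.sqrt(np.pi))

def q_star(sw2, sb2, iters=500):
    """Fixed point of the length map q -> sw2 * E[tanh(sqrt(q) z)^2] + sb2."""
    q = 1.0
    for _ in range(iters):
        q = sw2 * gauss_mean(lambda z: np.tanh(np.sqrt(q) * z) ** 2) + sb2
    return q

def chi(sw2, sb2):
    """Slope of the correlation map at c = 1 (chi < 1 ordered, chi > 1 chaotic)."""
    q = q_star(sw2, sb2)
    # tanh'(x) = sech(x)^2, so tanh'(x)^2 = 1 / cosh(x)^4.
    return sw2 * gauss_mean(lambda z: 1.0 / np.cosh(np.sqrt(q) * z) ** 4)
```

For zero bias variance, χ reduces to σ_w² at the fixed point q* = 0, so the mean-field transition sits at σ_w = 1; this is the boundary whose finite-width counterpart the paper shows to be fractal.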

📝 Abstract
Information propagation characterizes how input correlations evolve across layers in deep neural networks. This framework has been well studied using mean-field theory, which assumes infinitely wide networks. However, these assumptions break down for practical, finite-size networks. In this work, we study information propagation in randomly initialized neural networks with finite width and reveal that the boundary between ordered and chaotic regimes exhibits a fractal structure. This shows the fundamental complexity of neural network dynamics, in a setting that is independent of input data and optimization. To extend this analysis beyond multilayer perceptrons, we leverage recently introduced Fourier-based structured transforms, and show that information propagation in convolutional neural networks follows the same behavior. Our investigation highlights the importance of finite network depth with respect to the trade-off between separation and robustness.
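The ordered/chaotic dichotomy described in the abstract can be observed directly in a finite-width simulation. The toy sketch below (an assumed setup, not the paper's experiment) propagates two nearby inputs through the same random tanh MLP and measures their cosine similarity at the output: small weight variance pulls them together (ordered), large variance drives them apart (chaotic).

```python
import numpy as np

def depth_correlation(sigma_w, depth=50, width=256, seed=0):
    """Cosine similarity of two nearby inputs after `depth` random tanh layers."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    y = x + 0.1 * rng.standard_normal(width)  # small perturbation of x
    for _ in range(depth):
        # The same random weights act on both inputs (one shared network).
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        x, y = np.tanh(W @ x), np.tanh(W @ y)
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

print(depth_correlation(0.5))  # ordered regime: the perturbation washes out
print(depth_correlation(2.5))  # chaotic regime: the inputs decorrelate
```

Sweeping σ_w (and a bias scale σ_b) over a fine grid and classifying each point this way traces out the finite-width phase boundary whose fractal geometry is the paper's central object.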
Problem

Research questions and friction points this paper is trying to address.

How does information propagate in finite-width neural networks, beyond the infinite-width mean-field regime?
Does the boundary between the ordered and chaotic regimes have a fractal structure?
Can the propagation analysis be extended to CNNs via Fourier-based structured transforms?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzing the dynamics of finite-width neural networks
Revealing fractal structure in propagation boundaries
Extending analysis using Fourier-based transforms