🤖 AI Summary
This study investigates how overparameterization reshapes high-dimensional non-convex loss landscapes, in particular how much information about a planted signal the initialization carries in a teacher-student model. Using a field-theoretic analysis of the Hessian spectrum at initialization, complemented by numerical simulations, we uncover a Baik–Ben Arous–Péché (BBP) phase transition governing the interplay between initialization informativeness, sample size, and signal-to-noise ratio (SNR). Key contributions: (i) overparameterization bends the loss landscape and shifts the BBP transition point, enabling recovery arbitrarily close to the information-theoretic weak-recovery threshold; (ii) we distinguish continuous from discontinuous BBP transitions and propose a new learnability threshold in the low-SNR regime; (iii) theory and simulations agree closely, and strong finite-size effects enable partial signal recovery even below the predicted BBP threshold.
📝 Abstract
High-dimensional non-convex loss landscapes play a central role in the theory of machine learning. Gaining insight into how these landscapes interact with gradient-based optimization methods, even in relatively simple models, can shed light on this enigmatic feature of neural networks. In this work, we focus on a prototypical learning problem that generalizes the Phase Retrieval inference problem by allowing the exploration of overparameterized settings. Using techniques from field theory, we analyze the spectrum of the Hessian at initialization and identify a Baik–Ben Arous–Péché (BBP) transition in the amount of data that separates regimes where the initialization is informative or uninformative about the planted signal of a teacher-student setup. Crucially, we demonstrate how overparameterization can bend the loss landscape, shifting the transition point, even reaching the information-theoretic weak-recovery threshold in the limit of large overparameterization, while also altering the transition's qualitative nature. We distinguish between continuous and discontinuous BBP transitions and support our analytical predictions with simulations, examining how they compare with finite-N behavior. For discontinuous BBP transitions, strong finite-N corrections allow the retrieval of information at a signal-to-noise ratio (SNR) smaller than the predicted BBP transition. In these cases, we provide estimates for a new, lower SNR threshold marking the point at which the initialization becomes entirely uninformative.
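The BBP transition at the heart of these results can be illustrated with a minimal spiked random-matrix sketch (a rank-one deformed Wigner matrix, a standard toy setting, not the paper's Hessian model): below spike strength 1 the top eigenvalue sticks to the bulk edge and the top eigenvector carries essentially no information about the planted direction, while above it the eigenvalue detaches from the bulk and the eigenvector overlaps with the signal.

```python
import numpy as np

def bbp_demo(snr, n=2000, seed=0):
    """Top eigenvalue and signal overlap of a rank-one spiked Wigner matrix."""
    rng = np.random.default_rng(seed)
    g = rng.normal(size=(n, n))
    w = (g + g.T) / np.sqrt(2 * n)         # Wigner matrix, bulk supported on [-2, 2]
    v = np.ones(n) / np.sqrt(n)            # planted unit "signal" direction
    m = snr * np.outer(v, v) + w           # rank-one spike of strength snr
    vals, vecs = np.linalg.eigh(m)
    # Return the largest eigenvalue and its eigenvector's overlap with v.
    return vals[-1], abs(vecs[:, -1] @ v)

# Below the BBP threshold (snr < 1): top eigenvalue ~ bulk edge 2, overlap ~ 0.
# Above it: top eigenvalue ~ snr + 1/snr, squared overlap ~ 1 - 1/snr**2.
```

In this toy model the transition is continuous: the overlap grows smoothly from zero as the spike strength crosses 1, in contrast with the discontinuous BBP transitions the paper analyzes, where finite-N corrections become important.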