Deep Learning of Compositional Targets with Hierarchical Spectral Methods

📅 2026-02-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates whether deep models possess an intrinsic advantage in learning high-dimensional compositional target functions. In a high-dimensional Gaussian setting, the authors propose a three-layer hierarchical spectral estimator that uncovers latent intermediate structure in the input through staged learning, decomposing a complex task into a sequence of simpler spectral estimation problems. Leveraging an explicit three-layer model and Gaussian universality theory, they provide the first rigorous proof that depth substantially reduces sample complexity, establishing a clear separation from shallow methods. The results show that the three-layer strategy achieves markedly better sample efficiency than two-layer approaches, offering rigorous theoretical evidence, within a controlled setting, for the inherent benefits of depth in learning.

📝 Abstract
Why depth yields a genuine computational advantage over shallow methods remains a central open question in learning theory. We study this question in a controlled high-dimensional Gaussian setting, focusing on compositional target functions. We analyze their learnability using an explicit three-layer fitting model trained via layer-wise spectral estimators. Although the target is globally a high-degree polynomial, its compositional structure allows learning to proceed in stages: an intermediate representation reveals structure that is inaccessible at the input level. This reduces learning to simpler spectral estimation problems, well studied in the context of multi-index models, whereas any shallow estimator must resolve all components simultaneously. Our analysis relies on Gaussian universality, leading to sharp separations in sample complexity between two- and three-layer learning strategies.
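The abstract refers to layer-wise spectral estimators of the kind studied for multi-index models. As a minimal illustration (not the paper's exact estimator), the sketch below recovers the hidden direction of a single-index target y = g(⟨w, x⟩) from isotropic Gaussian inputs via a standard second-order spectral method: the top eigenvector of the empirical matrix M = (1/n) Σᵢ yᵢ (xᵢxᵢᵀ − I) aligns with w whenever E[g(t)(t² − 1)] ≠ 0. The link function g(t) = t² − 1 and all dimensions here are illustrative choices.

```python
import numpy as np

# Hypothetical illustration of a second-order spectral estimator for a
# single-index model, the basic building block of the staged (layer-wise)
# learning the abstract describes.
rng = np.random.default_rng(0)
d, n = 50, 20_000

w = rng.standard_normal(d)
w /= np.linalg.norm(w)                      # ground-truth unit direction

X = rng.standard_normal((n, d))             # isotropic Gaussian inputs
y = (X @ w) ** 2 - 1                        # example link g(t) = t^2 - 1 (zero-mean)

# M = (1/n) sum_i y_i (x_i x_i^T - I); for this link E[M] = 2 w w^T,
# so the top eigenvector of M estimates w up to sign.
M = (X * y[:, None]).T @ X / n - y.mean() * np.eye(d)
eigvals, eigvecs = np.linalg.eigh(M)        # ascending eigenvalues
w_hat = eigvecs[:, -1]                      # eigenvector of the largest eigenvalue

alignment = abs(w_hat @ w)                  # |cos(angle)|, close to 1 on success
print(f"alignment: {alignment:.3f}")
```

In the hierarchical setting of the paper, the output of such a stage would serve as the intermediate representation on which the next spectral estimation problem is posed.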
Problem

Research questions and friction points this paper is trying to address.

deep learning
compositional functions
sample complexity
spectral methods
hierarchical learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

compositional targets
hierarchical spectral methods
depth separation
sample complexity
Gaussian universality