Sharper Generalization Bounds for Transformer

📅 2026-03-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work establishes tight, architecture-dependent generalization error bounds for Transformer models. By integrating shifted Rademacher complexity with structural properties of the model—such as matrix rank, norms, and empirical covering numbers—it derives the first explicit generalization bounds tailored to single-layer single-head, single-layer multi-head, and multi-layer Transformers. The approach dispenses with the conventional assumption of bounded input features, thereby accommodating unbounded inputs and heavy-tailed distributions. The resulting bounds achieve the theoretically optimal convergence rate, significantly refining the characterization of the generalization capability of Transformer architectures.
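The quadratic "offset" in the complexity measure is the central device behind the fast rates described above. As an illustrative sketch (not code from the paper; the finite predictor set, sample size, and offset parameter below are toy assumptions), the offset Rademacher complexity of a finite class can be estimated by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)

def offset_rademacher(preds, c=1.0, n_draws=2000, rng=rng):
    """Monte Carlo estimate of the offset Rademacher complexity
    of a finite predictor set, given its values on a fixed sample.

    preds: array of shape (num_functions, n), entry [j, i] = f_j(x_i).
    The offset complexity replaces sup_f (1/n) sum_i eps_i f(x_i)
    by sup_f (1/n) sum_i [eps_i f(x_i) - c * f(x_i)**2];
    the quadratic penalty is what yields localized, faster rates.
    """
    num_f, n = preds.shape
    total = 0.0
    for _ in range(n_draws):
        eps = rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
        vals = (preds @ eps - c * (preds ** 2).sum(axis=1)) / n
        total += vals.max()  # sup over the finite class
    return total / n_draws

# Toy example: 50 "predictors" evaluated on a sample of size 200.
preds = rng.normal(size=(50, 200))
plain = offset_rademacher(preds, c=0.0)   # c=0 recovers the ordinary Rademacher complexity
offset = offset_rademacher(preds, c=1.0)  # the offset version is strictly smaller
print(plain, offset)
```

With `c=0` the estimator reduces to the classical Rademacher complexity, so the gap between the two printed numbers visualizes exactly the localization the paper exploits.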

📝 Abstract
This paper studies generalization error bounds for Transformer models. Based on the offset Rademacher complexity, we derive sharper generalization bounds for different Transformer architectures, including single-layer single-head, single-layer multi-head, and multi-layer Transformers. We first express the excess risk of Transformers in terms of the offset Rademacher complexity. By exploiting its connection with the empirical covering numbers of the corresponding hypothesis spaces, we obtain excess risk bounds that achieve optimal convergence rates up to constant factors. We then derive refined excess risk bounds by upper bounding the covering numbers of Transformer hypothesis spaces using matrix ranks and matrix norms, leading to precise, architecture-dependent generalization bounds. Finally, we relax the boundedness assumption on feature mappings and extend our theoretical results to settings with unbounded (sub-Gaussian) features and heavy-tailed distributions.
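For reference, the central quantity the abstract builds on can be sketched as follows (a standard Liang–Rakhlin–Sridharan-style definition; the paper's exact normalization may differ):

```latex
% Offset Rademacher complexity of a class F on a sample x_1, ..., x_n,
% with offset parameter c > 0.
\mathcal{R}_n^{\mathrm{off}}(\mathcal{F}, c)
  \;=\; \mathbb{E}_{\varepsilon}\,
        \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n}
        \Bigl[ \varepsilon_i f(x_i) \;-\; c\, f(x_i)^2 \Bigr],
\qquad \varepsilon_i \overset{\text{i.i.d.}}{\sim} \mathrm{Unif}\{\pm 1\}.
```

Because the supremum is penalized by a quadratic term, a finite $\varepsilon$-net of the hypothesis space controls it at a rate of roughly $\log N(\mathcal{F}, \varepsilon)/n$ rather than the slower $\sqrt{\log N(\mathcal{F}, \varepsilon)/n}$, which is how the covering-number bounds on Transformer hypothesis spaces translate into the fast rates the abstract claims.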
Problem

Research questions and friction points this paper is trying to address.

generalization bounds
Transformer
Rademacher complexity
covering numbers
excess risk
Innovation

Methods, ideas, or system contributions that make the work stand out.

offset Rademacher complexity
generalization bounds
Transformer architecture
covering numbers
matrix norms
Yawen Li
Lawrence Technological University
Biomaterials, Tissue Engineering, BioMEMS
Tao Hu
School of Mathematical Sciences, Capital Normal University
Survival Analysis, Robust Statistics
Zhouhui Lian
Peking University
Computer Graphics, Computer Vision, AI
Wan Tian
Wangxuan Institute of Computer Technology, Peking University, China, 100871; Advanced Institute of Information Technology, Peking University
Yijie Peng
Peking University
Simulation, Bayesian Learning, Artificial Intelligence, Healthcare, Financial Engineering
Huiming Zhang
School of Artificial Intelligence, Beihang University, 100191 Beijing, China
Zhongyi Li
School of Artificial Intelligence, Beihang University, 100191 Beijing, China