Alex Damian

Google Scholar ID: dymr3s8AAAAJ
Harvard University
Citations & Impact (all-time)
  • Citations: 1,948
  • H-index: 11
  • i10-index: 12
  • Publications: 14
  • Co-authors: 0
Resume
Academic Achievements
  • Published multiple research papers on deep learning optimization dynamics, representation learning, and computational-to-statistical gaps, including:
    - The Generative Leap: Sharp Sample Complexity for Efficiently Learning Gaussian Multi-Index Models (NeurIPS 2025)
    - Learning Compositional Functions with Transformers from Easy-to-Hard Data (COLT 2025)
    - Understanding Optimization in Deep Learning with Central Flows (ICLR 2025)
    - Computational-Statistical Gaps in Gaussian Single-Index Models (COLT 2024)
    - How Transformers Learn Causal Structure with Gradient Descent (ICML 2024)
    - Smoothing the Landscape Boosts the Signal for SGD: Optimal Sample Complexity for Learning Single Index Models (NeurIPS 2023)
    - Provable Guarantees for Nonlinear Feature Learning in Three-Layer Neural Networks (NeurIPS 2023)
    - Self-Stabilization: The Implicit Bias of Gradient Descent at the Edge of Stability (ICLR 2023)
Research Experience
  • Focused on the mathematical foundations of deep learning, with extensive research on deep learning optimization dynamics, representation learning in simple models, and computational-to-statistical gaps. Published papers at conferences including NeurIPS, COLT, ICLR, and ICML.
Education
  • Received a Ph.D. in Applied and Computational Mathematics from Princeton University under the supervision of Jason D. Lee, and a B.S. in Mathematics from Duke University, where he worked with Cynthia Rudin and Hau-Tieng Wu.
Background
  • Currently a Kempner Research Fellow at the Kempner Institute at Harvard University. In Fall 2026, will join MIT as an Assistant Professor with a shared appointment between Mathematics and EECS (AI+D). Research interests include the mathematical foundations of deep learning, particularly deep learning optimization dynamics, representation learning in simple models, and computational-to-statistical gaps.
Miscellany
  • Actively looking for students starting in Fall 2026. Interested applicants should apply to either the Mathematics or the EECS department at MIT and list his name in their application.
Co-authors
  • 0 total (list not available)