Mo Zhou

Google Scholar ID: j_SEFF8AAAAJ
University of Washington
Machine Learning · Optimization
Citations & Impact
All-time
Citations: 376
H-index: 8
i10-index: 7
Publications: 12
Co-authors: 0
Publications
12 items
Resume
Academic Achievements
  • Published papers including 'Global Convergence of Gradient EM for Over-Parameterized Gaussian Mixtures' and 'How Does Gradient Descent Learn Features -- A Local Analysis for Regularized Two-Layer Neural Networks,' among others, at top conferences such as NeurIPS, ICML, and ICLR.
Research Experience
  • Worked with Prof. Tengyu Ma at Stanford (summer 2023); applied science intern at AWS AI (summer 2022); research intern with Prof. Tuo Zhao in Industrial and Systems Engineering (ISyE) at Georgia Tech (summer 2018).
Education
  • Ph.D. in Computer Science from Duke University (2019-2024), advised by Rong Ge; B.S. in Statistics from Peking University (2015-2019).
Background
  • Research interests: optimization and theoretical machine learning, with a particular focus on deep learning theory. Currently a postdoc at the Institute for Foundations of Data Science (IFDS), University of Washington.
Miscellany
  • Served as a reviewer for top-tier conferences including ICML, ICLR, NeurIPS, and STOC, as well as journals such as JMLR and Mathematical Programming.