Browse publications on Google Scholar.
Resume (English only)
Academic Achievements
Published papers include 'Global Convergence of Gradient EM for Over-Parameterized Gaussian Mixtures' and 'How Does Gradient Descent Learn Features -- A Local Analysis for Regularized Two-Layer Neural Networks,' among many others at top conferences such as NeurIPS, ICML, and ICLR.
Research Experience
Worked with Prof. Tengyu Ma at Stanford (summer 2023); applied science intern at AWS AI (summer 2022); research intern with Prof. Tuo Zhao in Industrial and Systems Engineering (ISyE) at Georgia Tech (summer 2018).
Education
Ph.D. in Computer Science from Duke University (2019-2024), advised by Rong Ge; B.S. in Statistics from Peking University (2015-2019).
Background
Research interests: optimization and theoretical machine learning, with a particular focus on deep learning theory. Currently a postdoctoral researcher at the Institute for Foundations of Data Science (IFDS), University of Washington.
Miscellany
Served as a reviewer for several top-tier venues, including the conferences ICML, ICLR, NeurIPS, and STOC, and the journals JMLR and Mathematical Programming.