Publications
Understanding Multimodal Contrastive Learning and Incorporating Unpaired Data. (AISTATS 2023).
Importance Tempering: Group Robustness for Overparameterized Models. (arXiv preprint arXiv:2209.08745, 2022; under review).
The Power of Contrast for Feature Learning: A Theoretical Analysis. (JMLR 2023).
An Unconstrained Layer Peeled Perspective on Neural Collapse. (ICLR 2022).
Research Experience
Conducted multiple research projects during his PhD, covering large language models, predictions as surrogates, scaling laws for the value of individual data points, covariate-assisted inference, and more.
Education
PhD student in the Department of Statistics at Stanford University, advised by Prof. Lihua Lei. Bachelor's degree from the School of Mathematical Sciences at Peking University; has also worked with Prof. Bin Dong, Prof. Weijie Su, Prof. Linjun Zhang, and Prof. James Zou.
Background
Broadly interested in statistics, machine learning, and economics, with a focus on establishing theoretical foundations and developing statistical tools for real-world problems.