- A Lightweight Transformer for Faster and Robust EBSD Data Collection (Scientific Reports, 2024)
- Fast and Provable Tensor Robust Principal Component Analysis via Scaled Gradient Descent (Information and Inference: A Journal of the IMA, 2023)
Awards:
- Wei Shen and Xuehong Zhang Presidential Fellowship (2024)
- Liang Ji-Dian Graduate Fellowship (2023)
- Michel and Kathy Doreau Graduate Fellowship (2023)
- NSF GRFP Honorable Mention (2023)
- UC Berkeley High Distinction (2021)
Research Experience
Currently a PhD student in the Department of Electrical and Computer Engineering at CMU, focusing on efficient machine learning algorithms, LLM inference efficiency and scaling, and applications of LLMs and diffusion models to challenging science problems, particularly in materials science. Also a part-time research intern at Meta, collaborating frequently with Beidi Chen and the team at AFRL. Previously worked on provable optimization methods for problems in estimation, traffic routing, and neuroscience.
Education
- PhD: Carnegie Mellon University, Electrical and Computer Engineering, Advisor: Professor Yuejie Chi, 2021-Present
- Undergraduate: UC Berkeley, Statistics and Computer Science, Graduated 2021
Background
Research Interests: Efficient machine learning algorithms, particularly LLM inference efficiency and scaling. Professional Field: Electrical and Computer Engineering (ECE). Background: Fifth-year (final-year) ECE PhD student at Carnegie Mellon University (CMU), advised by Professor Yuejie Chi. Currently a part-time research intern at Meta, collaborating frequently with Beidi Chen and the team at AFRL. Graduated from UC Berkeley with degrees in statistics and computer science in 2021.