Resume (English only)
Academic Achievements
Published multiple papers at top conferences such as NeurIPS, ICLR, and ICML, including 'Preference Learning Algorithms Do Not Learn Preference Rankings' and 'Sudden Drops in the Loss: Syntax Acquisition, Phase Transitions, and Simplicity Bias in MLMs'.
Research Experience
PhD intern at Google Research (streaming models); student researcher at Google Brain (evolution with LLMs); research intern at Prescient Design (LLMs for discrete sequence optimization); Visiting Researcher at Meta AI NYC; part-time ML Scientist at Prescient Design; incoming Senior Research Scientist at Google DeepMind.
Education
PhD: NYU Center for Data Science, Machine Learning for Language group, advised by Kyunghyun Cho. Bachelor's degree: Princeton Computer Science, senior thesis advised by Sebastian Seung; received an Outstanding Computer Science Thesis award.
Background
Research interests: Understanding and improving how large language models (LLMs) learn from online feedback, particularly through the lens of training dynamics. Recently also interested in applications of LLMs to biology.
Miscellany
Enjoys running and baking. Volunteers as a NYSDOH-certified rape and domestic violence crisis counselor/victim advocate with the NYC Crime Victims Treatment Center, serving local hospital emergency departments.