Worked with Annie En-Shiun Lee at the University of Toronto on language modeling, multilingual NLP, efficient and robust NLP, and multimodal LMs. Also conducted research with Maryam Mehri Dehnavi on techniques for more efficient pre-training and inference, with a focus on model compression for LLMs.
Education
Master's degree in Computer Science from Stanford University (Advisors: Diyi Yang and Jiaxin Pei); Bachelor's degree in Computer Science from the University of Toronto (Focus: Machine Learning, Statistics, and Chemistry; Advisors: Annie En-Shiun Lee and Maryam Mehri Dehnavi)
Background
Primary research interests lie in language models, multilingual NLP, multimodal LMs, and evaluation for general-purpose generation tasks. Also interested in other areas, such as agentic systems and statistical approaches, for building robust and generalizable AI systems. Additionally involved in data-driven decision-making across several research areas, including parallel and distributed databases and machine-learning-based drug discovery.
Miscellany
Outside of research, enjoys playing guitar and singing, exploring new foods, and watching professional eSports (mostly competitive FPS games).