Published multiple papers at top international conferences, including COLM'25, ICML'25, NeurIPS'25, ICLR'25, ACL'25, NAACL'25, EMNLP'25, NAACL'24, ICLR'24, and ICML'24.
Research Experience
Interned at several prominent groups including Google DeepMind with Pete Shaw, Kenton Lee, and Mandar Joshi (Summer 2025); FAIR (Meta) with Jason Weston and Maryam Fazel-Zarandi (Summer 2024); Allen Institute for AI (AI2) with Tushar Khot, Ashish Sabharwal, and Peter Clark (Summer 2023); and Adobe Research in 2022.
Education
PhD: Department of Computer Science, University of North Carolina at Chapel Hill, Advisor: Mohit Bansal; Supported by the Apple Scholars in AI/ML PhD fellowship.
Background
Research interests: Natural Language Processing and Machine Learning, with a focus on developing methods to evaluate and strengthen reasoning in Large Language Models (LLMs). Aims to enable LLMs to identify and rectify issues in their own reasoning, improve alignment, and deepen the understanding of the reasoning process. Also explores practical applications of LLM reasoning in domains such as planning and coding.
Miscellany
Currently on the job market, seeking industry Research Scientist positions starting in Summer/Fall 2026.