Published papers on reducing hallucinations in large language models, including DoLa, Lookback Lens, and SelfCite; developed DiffCSE for better sentence embeddings and Query Reranking for more accurate passage retrieval; and contributed to MetaCLIP 2, which will be presented at NeurIPS 2025.
Research Experience
Currently a PhD student at MIT CSAIL, advised by Prof. Jim Glass. Interned at Meta FAIR in summer 2025 with Luke Zettlemoyer; at Meta FAIR in summer 2024 with Hu Xu, Daniel Li, and Scott Yih; at Microsoft in summer 2023 with Pengcheng He and Yujia Xie; and at MIT-IBM Watson AI Lab in summer 2022 with Yang Zhang, Shiyu Chang, Yoon Kim, and Kaizhi Qian.
Education
PhD student at MIT CSAIL, working with Prof. Jim Glass; received a B.S. in Electrical Engineering from National Taiwan University in 2020, where he conducted research in speech processing and NLP under the guidance of Hung-Yi Lee, Yun-Nung Chen, and Lin-shan Lee.
Background
Research interests center on large language models, particularly hallucinations, factuality, and retrieval-augmented generation. During his internship at Meta FAIR, he worked on pre-training MetaCLIP 2, a multilingual vision-language model.