- Paper 'Merging by Matching Models in Task Subspaces' published in TMLR March 2024
- Paper 'TIES-Merging: Resolving Interference When Merging Models' presented at NeurIPS 2023
- Paper 'Simple Weakly-Supervised Image Captioning via CLIP's Multimodal Embeddings' presented at AAAI Workshop on Creative AI Across Modalities, 2023
- Paper 'Evaluating the Factual Consistency of Large Language Models Through News Summarization' presented at ACL 2023
- Paper 'Few-Shot Parameter-Efficient Fine-Tuning is Better and Cheaper than In-Context Learning' presented at NeurIPS 2022
- Paper 'An Empirical Survey of Data Augmentation for Limited Data Learning in NLP' published in TACL 2022
- Paper 'Isochrony-Aware Neural Machine Translation for Automatic Dubbing' presented at Interspeech 2022
- Paper 'Improving and Simplifying Pattern Exploiting Training' presented at EMNLP 2021 (Short)
- Paper 'Predicting Institution Hierarchies with Set-based Models' presented at AKBC 2020
- Paper 'Optimal Transport-based Alignment of Learned Character Representations for String Similarity' presented at ACL 2019 (Oral)
Research Experience
- Conducted doctoral research at the University of Toronto, focusing on model merging and NLP
- Spent the first three years of the Ph.D. program at UNC Chapel Hill, collaborating with Mohit Bansal
Education
- Ph.D.: University of Toronto, advisor Colin Raffel
- M.S.: University of Massachusetts Amherst, advisor Andrew McCallum
- B.S.: Carnegie Mellon University, major in Computer Science and Statistics
Background
Ph.D. student at the University of Toronto, advised by Colin Raffel. Research interests include machine learning and NLP, particularly model merging and understanding why different merging methods behave the way they do. Previously worked on parameter-efficient fine-tuning and few-shot learning.
Miscellany
- Outside of research, a member of Friendship Baptist Church