Researched language control for physics-based character animation at the NVIDIA Toronto AI Lab, advised by Jason Peng and Sanja Fidler.
Interned twice at Facebook: once on data pipelines for machine translation, and once on neural image compression, where the work included helping to create an open-source library for compression research.
Worked on sparse training algorithms for transformer-based language models at Cerebras.
At Groq, helped build a prototype compiler that converts computational graphs into instructions for the company's ASIC.
Before undergrad, spent several summers as a biochemistry student researcher in the Kay Lab at the University of Toronto, studying the ClpP protease.