Published multiple papers, including 'From Dormant to Deleted: Tamper-Resistant Unlearning Through Weight-Space Regularization' (NeurIPS 2025) and 'Improved Localized Machine Unlearning Through the Lens of Memorization' (TMLR 2025); has organized and spoken at several major conferences and workshops.
Research Experience
Worked as a Research Scientist at Google Brain; co-organized the NeurIPS 2022 workshop on meta-learning; gave an invited talk at the NeurIPS 2021 workshop on meta-learning; led a NeurIPS 2022 tutorial on the role of meta-learning in few-shot learning; co-organized the first competition on machine unlearning at NeurIPS 2023; gave a keynote on machine unlearning at CoLLAs 2024; presented a tutorial on unlearning at CVPR 2024.
Education
PhD from the University of Toronto (2021), advised by Raquel Urtasun and Richard Zemel.
Background
She is a senior research scientist at Google DeepMind, based in London, UK. Her main research interest is in creating methods that allow deep neural networks to adapt efficiently and effectively: coping with distribution shifts, rapidly learning new tasks, and supporting efficient unlearning of data points. Her research falls in the areas of few-shot learning, meta-learning, domain adaptation, and machine unlearning.
Miscellany
In her free time, she enjoys food blogging, dance (especially hip hop and commercial), playing piano, singing, and songwriting.