Browse publications on Google Scholar
Resume (English only)
Academic Achievements
October 2025: Awarded an ARC Discovery Project grant on diffusion models.
October 2025: TMLR paper received a J2C certification.
September 2025: Two papers accepted to NeurIPS.
July 2025: Preprint with Michalis Titsias — Sparse GPs: Structured approximations and Power-EP revisited.
June 2025: Preprint on Rao-Blackwellised reparameterisation gradients, led by Kevin Lam, with George Deligiannidis and Yee Whye Teh.
May 2025: TMLR paper with Matt Ashman and Rich Turner.
February 2025: New preprint on tighter sparse variational GP approximations.
November 2024: Attended a Dagstuhl workshop.
October 2024: Two preprints, on semantic entropy in LLMs and on likelihood approximations.
Research Experience
Lecturer at the University of Sydney from 2018 to 2022, with two years (2019-2020) at Uber AI; Lecturer at the School of Computing, Australian National University, since July 2022.
Education
PhD from the Machine Learning group at the University of Cambridge, supervised by Richard Turner and advised by Carl Rasmussen.
Background
Lecturer in Machine Learning at the School of Computing, Australian National University, with research interests including probabilistic modelling and inference, Monte Carlo and approximate inference methods, distributed, active, and continual learning, and model-based reinforcement learning.
Miscellany
Looking for PhD/MPhil students and postdoctoral fellows.