Resume
Academic Achievements
Publications: 'Disentangled Representational Learning using the Gromov Wasserstein paradigm', accepted to ICLR 2025; two papers accepted at NeurIPS 2024, 'GENOT: Entropic (Gromov) Wasserstein Flow Matching' and 'Mirror and Preconditioned Gradient Descent in Wasserstein Space', the latter selected as a spotlight.
Research Experience
Since February 2024: Visiting Ph.D. student in Fabian Theis' lab at Helmholtz Munich, working on generative models for predicting cellular responses to perturbations; June to August 2024: Intern at the Flatiron Institute in Michael Shelley's team, developing multi-marginal generative models for population dynamics; Since December 2024: Intern in Amazon's fundamental research team, led by Stefano Soatto, working on large language models for reasoning.
Education
Ph.D.: Supervised by Marco Cuturi at CREST-ENSAE, Institut Polytechnique de Paris; Master's Degree: Mathematics, Vision and Learning (MVA) program at ENS Paris-Saclay, with a master's thesis supervised by Claire Boyer (Sorbonne Université), Julie Josse (INRIA), and Boris Muzellec (Owkin) at LPSM.
Background
Research Interests: Optimal Transport (OT), Generative Modeling, and Representation Learning. Professional Field: Machine Learning. Brief Introduction: His research aims to incorporate optimal transport into flow-, diffusion-, and VAE-based generative models to improve their performance.