- Publication: Decentralized Asynchronous Optimization with DADAO allows Decoupling and Acceleration, JMLR 2025
- Preprints/Technical Reports: μLO: Compute-Efficient Meta-Generalization of Learned Optimizers; Model Parallelism With Subnetwork Data Parallelism; among others.
Previously worked at the Flatiron Institute (CCM), École Polytechnique (DepMap), CentraleSupélec (OPIS), INRIA Lille (SequeL) with Michal Valko, École Normale Supérieure (DATA) with Stéphane Mallat, and ENS Cachan (Ker Lann campus).
Background
CNRS researcher in the MLIA team at Sorbonne University. Research interests include the foundations of machine learning techniques, the symmetries of deep neural networks, and the development of algorithms for large-scale distributed and decentralized training.
Miscellany
Contact: edouard.oyallon@cnrs.fr. Looking for highly motivated students and colleagues passionate about optimizing and accelerating the training of massive LLMs.