Ambroise Odonnat
Scholar

Google Scholar ID: M_OS-3kAAAAJ
Ph.D. Student - Noah's Ark Lab & Inria
Deep Learning · Machine Learning · Vision Transformers · LLMs · Distribution Shifts
Citations & Impact
All-time
Citations: 112
H-index: 5
i10-index: 3
Publications: 12
Co-authors: 34
Resume (English only)
Academic Achievements
  • Received an ICML Oral Award, an ICASSP Oral Award, and the QBIN Best Flash Talk Award for his research. Recent work includes 'Easing Optimization Paths: A Circuit Perspective' (accepted at ICASSP 2025), the preprint 'Clustering Heads', and 'Large Language Models as Markov Chains', which was featured in Forbes.
Research Experience
  • Currently a Ph.D. student at Huawei Noah's Ark Lab & Inria in Paris; has presented research at leading institutions such as EPFL, ENS Ulm, and Criteo; contributes to open-source libraries.
Education
  • Graduated from École des Ponts ParisTech in 2023 and holds a master's degree in Mathematics, Vision, and Machine Learning (MVA) from ENS Paris-Saclay. Supervised by Romain Tavenard, Laetitia Chapel, and Ievgen Redko.
Background
  • Interested in improving the core understanding of Transformers, particularly large language models, out-of-distribution generalization, Transformer training and fine-tuning, Vision Transformers, and Time Series forecasting.
Miscellany
  • Enjoys working both with a few collaborators and as part of a larger team; maintains a research blog named logB.