Michael E. Sander
Google Scholar ID: COqAqcMAAAAJ
Google DeepMind
Machine Learning · Applied Mathematics
Citations & Impact (all-time)
  • Citations: 897
  • h-index: 8
  • i10-index: 8
  • Publications: 13
  • Co-authors: 15
Resume (English only)
Academic Achievements
  • Joint Learning of Energy-based Models and their Partition Function, ICML 2025
  • Loss Functions and Operators Generated by f-Divergences, ICML 2025
  • Towards Understanding the Universality of Transformers for Next-Token Prediction, ICLR 2025
  • PhD Manuscript: Deeper Learning: Residual Networks, Neural Differential Equations and Transformers, in Theory and Action
  • How do Transformers perform In-Context Autoregressive Learning?, ICML 2024
  • Implicit regularization of deep residual networks towards neural ODEs, ICLR 2024 (Spotlight)
  • Fast, Differentiable and Sparse Top-k: a Convex Analysis Perspective, ICML 2023
  • Do Residual Neural Networks discretize Neural Ordinary Differential Equations?, NeurIPS 2022
  • Vision Transformers provably learn spatial structure, NeurIPS 2022
  • Sinkformers: Transformers with Doubly Stochastic Attention, AISTATS 2022
  • Momentum Residual Neural Networks, ICML 2021