Michael Crawshaw

Google Scholar ID: XVrMZ_4AAAAJ
George Mason University
Machine learning, optimization, deep learning, federated learning
Citations & Impact
All-time
  • Citations: 1,233
  • H-index: 5
  • i10-index: 5
  • Publications: 11
  • Co-authors: 0
Resume
Academic Achievements
  • Publications: International Conference on Machine Learning (ICML), International Conference on Learning Representations (ICLR), Neural Information Processing Systems (NeurIPS), and more.
  • Reviewer service: 2026: AAAI, ICLR; 2025: ICLR (Best Reviewer), AISTATS, ICML (Top Reviewer), NeurIPS (Top Reviewer); 2024: AISTATS, ICML, NeurIPS; 2023: NeurIPS (Top Reviewer, top 8%).
Research Experience
  • Summer 2025 intern at the Flatiron Institute's Center for Computational Mathematics, working with Robert Gower. Formalized several variants of the Muon optimizer as instances of non-Euclidean gradient descent, and significantly improved robustness to learning-rate tuning through model truncation.
Education
  • Ph.D.: Department of Computer Science, George Mason University. Advisor: Professor Mingrui Liu.
  • B.S.: Mathematics and Computer Science, Ohio State University.
Background
  • Research interests: Theory of optimization for machine learning.
  • Areas of expertise: Distributed optimization and federated learning; optimization under non-standard assumptions (e.g., relaxed smoothness); optimization with large step sizes and the Edge of Stability.
  • Background: Ph.D. student in the Department of Computer Science at George Mason University, advised by Professor Mingrui Liu.
Miscellany
  • In his spare time, he enjoys developing open-source tools for everyday life, including a simple arXiv paper recommender and a program that generates language-learning lessons from text files.