Guangxuan Xiao

Google Scholar ID: sRGO-EcAAAAJ
Ph.D. candidate, MIT
Deep Learning · Machine Learning
Citations & Impact (all-time)
  • Citations: 5,061
  • H-index: 14
  • i10-index: 15
  • Publications: 19
  • Co-authors: 6
Resume (English only)
Academic Achievements
  • Papers: XAttention: Block Sparse Attention with Antidiagonal Scoring (ICML 2025); DuoAttention: Efficient Long-Context LLM Inference with Retrieval and Streaming Heads (ICLR 2025); Efficient Streaming Language Models with Attention Sinks (ICLR 2024); SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models (ICML 2023); FastComposer: Tuning-Free Multi-Subject Image Generation with Localized Attention (IJCV 2024).
Research Experience
  • Ph.D. Candidate at Massachusetts Institute of Technology (2022.08 - Present); Visiting Research Student at Stanford University (2020.07 - 2021.06, 2021.06 - 2021.11); Research Intern at NVIDIA (2024 - 2025); Research Scientist at Meta Inc. (2023).
Education
  • Ph.D. Candidate in EECS at MIT, advised by Prof. Song Han; S.M. in Computer Science at MIT; B.Eng. in Computer Science and B.Econ. in Economics (Second Major) at Tsinghua University, advised by Prof. Zhiyuan Liu; Visiting Research Student at Stanford University, advised by Prof. Jure Leskovec, Prof. Jiajun Wu, and Prof. Leslie Pack Kaelbling.
Background
  • Research Interests: efficient algorithms and systems for deep learning, particularly large foundation models.