Xiaozhe Yao
Google Scholar ID: Bhgm1tQAAAAJ
ETH Zurich
Machine Learning Systems
Machine Learning
LLMs
Homepage
Google Scholar
Citations & Impact (all-time)
Citations: 446
H-index: 8
i10-index: 8
Publications: 19
Co-authors: 9
Contact
Email: askxzyao@gmail.com
GitHub
LinkedIn
Publications (6 listed)
Taming the Long-Tail: Efficient Reasoning RL Training with Adaptive Drafter (2025), cited 0
Mixtera: A Data Plane for Foundation Model Training (2025), cited 0
ThunderServe: High-performance and Cost-efficient LLM Serving in Cloud Environments (2025), cited 0
Demystifying Cost-Efficiency in LLM Serving over Heterogeneous GPUs (2025), cited 0
Aurora-M: Open Source Continual Pre-training for Multilingual Language and Code (2024), cited 5
DeltaZip: Efficient Serving of Multiple Full-Model-Tuned LLMs (2023), cited 6
Resume (English only)
Academic Achievements
Involved in building Swiss AI Serving
Proposed DeltaZip for serving multiple full-model-tuned LLMs efficiently
Provided an information-theoretic perspective on why delta compression works
Contributed to the Vision Debugging Benchmark at DataPerf
Built a neural network from scratch
Co-authors (9 total)
Ce Zhang
Together AI; University of Chicago
Ana Klimovic
ETH Zurich
Co-author 3
Binhang Yuan (袁彬航)
Hong Kong University of Science and Technology
Beidi Chen
Carnegie Mellon University
Sanmi Koyejo
Assistant Professor, Stanford University
Co-author 7
Cedric Renggli
Apple