Aditya Tomar
Google Scholar ID: hda-kVkAAAAJ
UC Berkeley
Efficient Deep Learning · Machine Learning Systems · High Performance Computing
Citations & Impact (All-time)
  • Citations: 14
  • H-index: 2
  • i10-index: 1
  • Publications: 7
  • Co-authors: 5
Resume
Academic Achievements
  • Publications: XQuant: Breaking the Memory Wall for LLM Inference with KV Cache Rematerialization; QuantSpec: Self-Speculative Decoding with Hierarchical Quantized KV Cache; Democratizing AI: Open-source Scalable LLM Training on GPU-based Supercomputers; Eve: Less Memory, Same Might; Beyond Next-Token Prediction: A Performance Characterization of Diffusion versus Autoregressive Language Models; Can Transformers Break Encryption Schemes via In-Context Learning?; Automated Programmatic Performance Analysis of Parallel Programs
  • Awards: 2025 EECS Evergreen Undergraduate Research Award; 2024 ACM Gordon Bell Prize Finalist; 2024 ACM SIGHPC Travel Grant Award; 2024 TCHPC/TCPP Travel Grant Award; 2023 Regents’ and Chancellor’s Scholarship; 2023 Leadership Award; 2022 National Merit Scholarship Winner; 2021 Stack Overflow Top 0.01% Contributor
Research Experience
  • Currently a researcher at BAIR under Prof. Kurt Keutzer; previously worked at PSSG under Prof. Abhinav Bhatele on memory-efficient adaptive optimization algorithms and efficient parallel training methods for LLMs.
Education
  • UC Berkeley, Undergraduate in EECS
Background
  • A third-year undergraduate at UC Berkeley studying EECS. Research interests lie broadly in machine learning systems, with a focus on efficient inference.