Scholar
Ajay Jaiswal
Google Scholar ID: I783HxYAAAAJ
RS@Apple | Amazon Ph.D. Fellow | UT Austin | IIT-KGP
Model Compression
Pruning
LLMs
Efficient Inference
Homepage
Google Scholar
Citations & Impact
All-time
Citations: 1,272
H-index: 22
i10-index: 30
Publications: 20
Co-authors: 19
Contact
No contact links provided.
Publications
6 items
SpecMD: A Comprehensive Study On Speculative Expert Prefetching (2026) · Cited: 0
MemoryLLM: Plug-n-Play Interpretable Feed-Forward Memory for Transformers (2026) · Cited: 0
AlphaDecay: Module-wise Weight Decay for Heavy-Tailed Balancing in LLMs (2025) · Cited: 0
Finding Fantastic Experts in MoEs: A Unified Study for Expert Dropping Strategies and Observations (2025) · Cited: 0
IDEA Prune: An Integrated Enlarge-and-Prune Pipeline in Generative Language Model Pretraining (2025) · Cited: 0
Sebra: Debiasing Through Self-Guided Bias Ranking (2025) · Cited: 0
Resume (English only)
Co-authors
19 total
Zhangyang (Atlas) Wang
XTX Markets & University of Texas at Austin
Shiwei Liu
ELLIS Institute Tübingen, MPI for Intelligent Systems, University of Oxford, CS@TU/e
Ying Ding
Bill & Lewis Suit Professor, School of Information, Dell Med, University of Texas at Austin
Tianlong Chen
Assistant Professor, CS@UNC Chapel Hill; Chief AI Scientist, hireEZ
Lu Yin
Asst. Professor, CS@University of Surrey & Research Fellow, CS@TU/e
Yifan Peng
Associate Professor at Weill Cornell Medicine
Justin F Rousseau
Associate Professor of Neurology, University of Texas Southwestern Medical Center