Guangji Bai
Google Scholar ID: gBMbU28AAAAJ
Applied Scientist, Amazon
Machine Learning · LLM Efficiency · Model Pruning
Citations & Impact (all-time)
  • Citations: 494
  • H-index: 12
  • i10-index: 12
  • Publications: 20
  • Co-authors: 11
Academic Achievements
  • Published papers:
    - SparseLLM: Towards Global Pruning for Pre-trained Language Models (NeurIPS 2024)
    - FedSpaLLM: Federated Pruning of Large Language Models (NAACL 2025 main)
    - Beyond Efficiency: A Systematic Survey of Resource-Efficient Large Language Models (under review at ACM CSUR)
    - Staleness-Alleviated Distributed GNN Training via Online Dynamic-Embedding Prediction (SDM 2025)
    - Continuous Temporal Domain Generalization (NeurIPS 2024)
    - Temporal Domain Generalization with Drift-Aware Dynamic Neural Networks (ICLR 2023 Oral)
    - Sign-Regularized Multi-Task Learning (SDM 2023)
    - Saliency-Regularized Deep Multi-Task Learning (KDD 2022)
    - Saliency-Augmented Memory Completion for Continual Learning (SDM 2023)
  • PC member for KDD, ICML, ICLR, AISTATS, NeurIPS, AAAI, ICDM, etc.
  • Primary writer for the NSF NAIRR 240189 grant ($15k) on parallel and distributed training of LLMs on graphs.
  • Student travel awards for KDD '22, ICLR '23, SDM '23, CIKM '23, NeurIPS '24
Research Experience
  • Applied Scientist at Amazon.com, working on the shopping recommendation system
  • Research intern at Argonne National Laboratory and NEC Laboratories America
Education
  • Ph.D. in Computer Science from Emory University, advised by Dr. Liang Zhao
  • Master's degree in Statistics from George Washington University, 2020
  • Bachelor's degree in Mathematics from Fudan University, 2018
Background
  • Applied Scientist at Amazon.com, building the next-generation shopping recommendation system with a focus on customer intent understanding and modeling. Research interests include efficient large-scale machine learning, domain and knowledge transfer, and neuro-inspired continual learning.