Pingzhi Li

Google Scholar ID: QUfhEyQAAAAJ
Ph.D. student @UNC-Chapel Hill
Deep Learning
Citations & Impact (all-time)
  • Citations: 188
  • H-index: 5
  • i10-index: 3
  • Publications: 20
  • Co-authors: 20
Resume (English only)
Academic Achievements
  • Mozart: Modularized and Efficient MoE Training on 3.5D Wafer-Scale Chiplet Architectures (NeurIPS 2025, Spotlight)
  • Occult: Optimizing Collaborative Communication across Experts for Accelerated Parallel MoE Training and Inference (ICML 2025)
  • Advancing MoE Efficiency: A Collaboration-Constrained Routing (C2R) Strategy for Better Expert Parallelism Design (NAACL 2025, SAC Award)
  • Model-GLUE: Democratized LLM Scaling for A Large Model Zoo in the Wild (NeurIPS 2024)
  • Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark (ICML 2024)
  • Merge, Then Compress: Demystify Efficient SMoE with Hints from Its Routing Policy (ICLR 2024, Spotlight)
  • Awards:
    - SAC Award, NAACL 2025 Low Resource Methods for NLP
    - 1st Place, ACM/IEEE Quantum Computing for Drug Discovery Challenge 2023
    - Outstanding Graduates Scholarship, USTC 2023
    - Silver Medal, Kaggle Feedback Prize - Evaluating Student Writing 2022
    - Outstanding Student Scholarship, USTC 2020/21/22
Research Experience
  • Part-time research intern at Apple AI/ML, working with the Foundation Model team; previously worked closely with Dr. Hanrui Wang.
Education
  • Ph.D. in Computer Science at UNC-Chapel Hill, advised by Prof. Tianlong Chen; B.E. in CS from USTC.
Background
  • Research interests: efficient AI computing, machine learning for science, and science-inspired machine learning. Enjoys simple, useful things, building systems, and turning ideas into reality.
Miscellany
  • Personal interests include browsing Twitter, Last.fm, and Instagram, and writing blog posts.