Yimeng Wu
Google Scholar ID: TrTASWoAAAAJ
Huawei Noah's Ark Lab
Large Language Models
Citations & Impact (all-time)
  • Citations: 333
  • h-index: 8
  • i10-index: 8
  • Publications: 12
  • Co-authors: 0
Resume
Academic Achievements
  • Efficient Citer: Tuning Large Language Models for Enhanced Answer Quality and Verification (NAACL 2024 Findings)
  • AraMUS: Pushing the Limits of Data and Model Scale for Arabic Natural Language Processing (ACL 2023 Findings)
  • Revisiting Pre-trained Language Models and their Evaluation for Arabic Natural Language Processing (EMNLP 2022)
  • Universal-KD: Attention-based Output-grounded Intermediate Layer Knowledge Distillation (EMNLP 2021)
  • ALP-KD: Attention-Based Layer Projection for Knowledge Distillation (AAAI 2021)
  • JABER and SABER: Junior and Senior Arabic BERT (arXiv 2021)
  • Why Skip If You Can Combine: A Simple Knowledge Distillation Technique for Intermediate Layers (EMNLP 2020)
Research Experience
  • NLP researcher at Huawei Noah's Ark Lab, Hong Kong, focusing on building pre-trained models and on model compression research. Previously worked at the Huawei Canada Research Centre for three years.
Education
  • Master's degree in Electrical and Computer Engineering, McGill University, 2019, supervised by Dr. Ioannis Psaromiligkos.
  • Bachelor's degree in Biomedical Engineering, Tianjin University, 2017.
Background
  • Research interests include LLM pretraining, knowledge distillation, and machine translation.
Miscellany
  • Looking for research interns interested in LLMs, especially Mixture-of-Experts (MoE) pretraining.