Yuanhao Yue
Google Scholar ID: HKy62pwAAAAJ
Fudan University
LLM · NLP · Instruction Tuning · Data Synthesis · Factuality
Citations & Impact (all-time)
Citations: 441
H-index: 4
i10-index: 3
Publications: 7
Co-authors: 3
Academic Achievements
  • Paper 'Distilling Instruction-following Abilities of Large Language Models with Task-aware Curriculum Planning' published in EMNLP 2024 Findings.
  • Paper 'Building a Family of Data Augmentation Models for Low-cost LLM Fine-tuning on the Cloud' accepted at COLING 2025 (Oral).
  • Paper 'DistilQwen2.5: Industrial Practices of Training Distilled Open Lightweight Language Models' accepted at ACL 2025.
  • Preprint 'EasyDistill: A Comprehensive Toolkit for Effective Knowledge Distillation of Large Language Models' on arXiv.
  • Survey paper 'Survey on Factuality in Large Language Models: Knowledge, Retrieval and Domain-Specificity' published in ACM Computing Surveys.
  • Paper 'Evaluating Open-QA Evaluation' published in NeurIPS 2023.
  • Contributed to SuperGPQA: a large-scale LLM evaluation benchmark covering 285 graduate disciplines, released on arXiv.
  • Reviewer for ACL, EMNLP, and NeurIPS.
  • Outstanding Graduate of Fudan University (2025).
  • Honorable Mention in the Mathematical Contest in Modeling (MCM, 2021).
  • Outstanding Debater in the School's Debate Competition (2018–2022).
  • Top 5% in AMC 12 (2017).