Yi Hu
Scholar

Google Scholar ID: VnuV_T0AAAAJ
Peking University
Machine Learning · Large Language Models
Citations & Impact (all-time)
  • Citations: 128
  • h-index: 5
  • i10-index: 5
  • Publications: 8
  • Co-authors: 7
Resume (English only)
Academic Achievements
  • Published two papers at NeurIPS 2025, 'Meta-RFFT' and 'PHYBench', and one paper at the NeurIPS Mechanistic Interpretability Workshop, 'What Affects the Effective Depth of Large Language Models?'.
  • Published 'Number Cookbook: Number Understanding of Language Models and How to Improve It' at ICLR 2025.
  • Published 'Case-Based or Rule-Based: How Do Transformers Do the Math?' at ICML 2024.
Research Experience
  • Collaborated with the School of Physics, Peking University, to release PHYBench, a physical reasoning benchmark for modern LLMs; contributed to multiple research projects, including Meta-RFFT and PHYBench.
Education
  • First-year Ph.D. student at the Institute for Artificial Intelligence, Peking University, advised by Prof. Muhan Zhang.
  • Bachelor of Science in Physics from the School of Physics, Peking University, advised by Prof. Huichao Song.
Background
  • Research interests: the reasoning mechanisms of large language models (LLMs) and raising their reasoning capabilities to the level of human experts. Also interested in related LLM topics, including efficiency, alignment, and applications across downstream domains.
Miscellany
  • Personal website is under construction.