Yequan Wang 王业全

Google Scholar ID: 7Gqp6FsAAAAJ
Beijing Academy of Artificial Intelligence
Large Model, Embodied AGI, NLP
Citations & Impact
All-time
  • Citations: 3,445
  • H-index: 18
  • i10-index: 25
  • Publications: 20
  • Co-authors: 22
Publications
20 items
Resume
Academic Achievements
  • Honorable Mention, AI 2000 Most Influential Scholars in NLP (2022)
  • Google Scholar citations: 4,000+
  • Key publications include:
    - 'Not All Layers of LLMs Are Necessary During Inference': proposed AdaInfer, reducing inference computation by up to 43% with <1% performance loss
    - 'Few-Shot Learner Generalizes Across AI-Generated Image Detection': introduced FSD, improving detection accuracy by 11.6% without retraining
    - '52B to 1T: Lessons Learned via Tele-FLM Series': insights on scaling large language models
  • PI of NSFC project 'Implicit Sentiment Analysis on Complicated Web Text' (Grant No. 62106249)
Research Experience
  • Currently a Researcher and Team Leader at BAAI and PKU
  • Principal Investigator (PI) of the National Key R&D Program 'Next-Generation Artificial Intelligence' (Nov 2022 – Nov 2025)
  • Leads development of the FLM family of large models (FLM-2, FLM-101B, FreeLM), featuring model-growth techniques, loss prediction, and the FreeLM training framework
  • Developed and open-sourced the Arabic Language Model (ALM 1.0)
  • Involved in the national project 'Artificial Intelligence Fundamental Model Support Platform and Evaluation Technology', building an open-source ecosystem for foundation models