Jingwei Zuo
Google Scholar ID: ClRmmOEAAAAJ
Technology Innovation Institute (TII)
Research interests: Large Language Models · Machine Learning · Data Mining
Citations & Impact (all-time)
  • Citations: 292
  • H-index: 9
  • i10-index: 9
  • Publications: 20
  • Co-authors: 2
Academic Achievements
  • Led the release of the Falcon-H1 series (May 2025), featuring a novel hybrid Transformer–SSM architecture and achieving state-of-the-art performance across multiple scales (0.5B to 34B).
  • Released Falcon-Edge (May 2025), a series of powerful, universal, fine-tunable 1.58-bit language models.
  • Launched Falcon 3 (December 2024), including base and instruct models from 1B to 10B parameters, plus an advanced 7B Mamba variant.
  • Released Falcon Mamba 7B (August 2024), the first strong attention-free 7B language model, with a technical report published in October 2024.
  • Published key preprints on arXiv, including 'Falcon-H1' (arXiv’25) and 'Falcon Mamba' (arXiv’24).
  • Co-organized the NeurIPS 2025 E2LM Competition (June 2025) on early-stage training evaluation of LLMs.
  • Served on program committees for ECAI 2024, CIKM 2023/2024, and ECML-PKDD 2020, and as a reviewer for journals including TKDE and Information Sciences.
  • Previously served as Web Chair for MDM 2020 and on the junior organizing committee for JDSE 2019.