Resume (English only)
Academic Achievements
Paper 'Understanding Emergent Abilities of Language Models from the Loss Perspective' accepted at NeurIPS 2024.
Paper 'Scaling Speech-Text Pre-training with Synthetic Interleaved Data' accepted at ICLR 2025.
Contributed to the release of multiple GLM series large models, including GLM-4.5, GLM-4-Voice, GLM-4-9B, and GLM-130B.
Published papers at top-tier venues including ACL, NeurIPS, ICLR, SIGIR, KDD, TKDE, and ECML/PKDD.
Awards: Tsinghua Excellent Bachelor Graduate (top 2%), Cai Xiong Scholarship (top 1%, awarded for outstanding research), and the Elite Collegiate Award from the China Computer Federation (73 recipients nationwide, only 4 from Tsinghua).
Research Experience
Tech Lead at ZhipuAI (Jan 2023–Present): Co-leading the ChatGLM pre-training team.
Research Intern at Beijing Academy of Artificial Intelligence (Sep 2020–Mar 2022), advised by Prof. Jie Tang.
Research Intern at DAMO Academy, Alibaba Group (Jun 2020–Sep 2020), advised by Hongxia Yang.
Research Intern at Cornell University (Jun 2019–Oct 2019), advised by Prof. Thorsten Joachims.
Research Assistant at Knowledge Engineering Group, Tsinghua University (Jun 2017–Jun 2019), advised by Prof. Jie Tang.