Browse publications on Google Scholar
Resume (English only)
Academic Achievements
Paper 'ProTrix: Building Models for Planning and Reasoning over Tables with Sentence Context' accepted by EMNLP 2024 Findings.
Paper 'Haste Makes Waste: Evaluating Planning Abilities of LLMs for Efficient and Feasible Multitasking with Time Constraints Between Actions' accepted by EMNLP 2025 Findings.
Paper 'DreamOn: Diffusion Language Models For Code Infilling Beyond Fixed-size Canvas' accepted by EMNLP 2025 Findings.
Paper 'Are LLMs capable of data-based statistical and causal reasoning? Benchmarking advanced quantitative reasoning with data' accepted by ACL 2024 Findings.
Released the diffusion language models Dream 7B, Dream-Coder, and DreamOn.
Benchmark Recipe2Plan accepted by EMNLP 2025 Findings.
Research Experience
Research intern at the Klear Team, Kuaishou Technology. Collaborated with Prof. Yansong Feng and Prof. Lingpeng Kong to develop the Dream Series of open-source diffusion language models.
Education
Bachelor's degree from the Integrated Science Program, Yuanpei College, Peking University. Master's student at the Wangxuan Institute of Computer Technology, Peking University, advised by Prof. Yansong Feng. Also worked with Prof. Chao Tang on metabolic flux analysis and single-cell foundation models.
Background
Research interests include diffusion language models and code generation. Served as a research intern at the Klear Team, Kuaishou Technology, working closely with Prof. Yansong Feng and Prof. Lingpeng Kong on the Dream Series of open-source diffusion language models.