- Preprints: 'Planner-Aware Path Learning in Diffusion Language Models Training' and two more
- Papers: 'VersaPRM: Multi-Domain Process Reward Model via Synthetic Reasoning Data' and four more
- Projects: Open-dCoder: The First Fully Open Diffusion LLM for Code
Research Experience
- Research Intern: Krafton AI, Madison, Jul 2025 - Sep 2025, Advisor: Jaewoong Cho
- Research Assistant: Shanghai AI Laboratory, Shanghai, May 2024 - Aug 2024, Advisor: Wenqi Shao
- Research Assistant: WestlakeNLP Lab, Westlake University, Hangzhou, Jun 2023 - Apr 2024, Advisor: Yue Zhang
- Teaching Assistant: COMP SCI 220 Data Science Programming I, University of Wisconsin-Madison, Spring 2025; COMP SCI 300 Programming II, University of Wisconsin-Madison, Fall 2024
Education
- Degree: Dual Bachelor's degrees, Ph.D. in progress
- Schools: University of Electronic Science and Technology of China (UESTC), University of Glasgow, University of Wisconsin-Madison
- Advisors: Not provided
- Time: Currently a second-year Ph.D. student
- Major: Computer Science
Background
- Research Interests: Post-training stages of large foundation models, particularly diffusion language models
- Field: Computer Science
- Bio: A second-year Ph.D. student in Computer Science at the University of Wisconsin-Madison, focused on improving the efficiency and intelligence of large foundation models, with a particular interest in diffusion language models.