Selected Publications
'Rectified Diffusion: Straightness Is Not Your Need in Rectified Flow', ICLR 2025.
'Phased Consistency Model', NeurIPS 2024.
Research Experience
Tencent AI Lab: Research Intern (2022.6 - 2022.12). Worked on Class-Incremental Learning; supervised by Dr. Liu Liu, collaborated with Dr. Yatao Bian.
Avolution AI (acquired by MiniMax): Research Collaboration (2023.10 - 2024.10). Worked on Video Diffusion Models and Diffusion Distillation; supervised by Dr. Zhaoyang Huang, collaborated with Dr. Xiaoyu Shi and Weikang Bian.
Google DeepMind: Research Intern (2025.2 - 2025.5). Focused on Diffusion Distillation and Reinforcement Learning; supervised by Dr. Long Zhao, Dr. Ting Liu, Dr. Hao Zhou, and Dr. LiangZhe Yuan, collaborated with Prof. Bohyung Han, Prof. Boqing Gong, Prof. Ming-Hsuan Yang, and Dr. Yukun Zhu.
Reve Art: Research Intern (2025.6 - Present). Focused on Multimodal Language Models, Diffusion Models, and Reinforcement Learning; supervised by Dr. Han Zhang.
Education
Ph.D. in Engineering, The Chinese University of Hong Kong (CUHK) (2023 - Present). Supervisors: Professor Hongsheng Li and Professor Xiaogang Wang.
B.Eng. in Artificial Intelligence, Nanjing University (2019 - 2023, Rank 2/88). Supervisors: Professor Han-Jia Ye and Professor Da-Wei Zhou (LAMDA Group).
Background
Research interests: scalable post-training techniques for diffusion models and unified multimodal models. Planning to enter the job market in 2027; open to industrial generative AI positions and postdoctoral roles.