Published several papers, including 'MiniCPM: Unveiling the Potential of Small Language Models with Scalable Training Strategies' (COLM 2024, Spotlight); 'SELF-GUIDE: Better Task-Specific Instruction Following via Self-Synthetic Finetuning' (COLM 2024); and 'Prompt2Model: Generating Deployable Models from Natural Language Instructions' (EMNLP 2023, System Demonstration).
Research Experience
During her PhD studies at UCLA, she has been involved in multiple research projects, including work on post-training pipelines and machine learning systems.
Education
Received a Bachelor of Engineering degree from the Department of Computer Science and Technology at Tsinghua University (THU), where she worked on LLM alignment and data synthesis. During her undergraduate years, she was supervised by Prof. Graham Neubig and Prof. Sherry Tongshuang Wu, and mentored by Vijay Viswanathan, at CMU.
Background
Research interests include post-training pipelines and machine learning systems. Currently a CS PhD student at UCLA, advised by Prof. Quanquan Gu, and a member of lmsys.org, working closely with Prof. Ying Sheng and Prof. Lianmin Zheng. Leads the RL Group in the SGLang team, building RLHF systems.
Miscellany
Hobbies include playing steel-string acoustic guitar, swimming, running, hiking, and cooking Szechwan cuisine. Her mottos include: 'ACL is not an AI conference'; 'I really want to make progress, teacher'; 'The bigger the waves, the more valuable the fish'; 'Talk is cheap. Show me the code'; 'A life without impact is like not living at all'; 'A good 985 is no worse than a junior college'; 'Life gave me a punch, but I played rock'; 'The only thing that can make me look down is the sun in Southern California'.