The paper 'DriftLite: Lightweight Drift Control for Inference-Time Scaling of Diffusion Models' is under review and was presented at the Molecular Machine Learning Conference (MoML) 2025. Another paper, 'A Unified Approach to Analysis and Design of Denoising Markov Models', is also under review.
Research Experience
Summer 2025: Visiting Researcher at the Flatiron Institute, Simons Foundation, hosted by Dr. Jiequn Han, working on inference-time scaling of diffusion models. Summers 2023 and 2024: Applied Scientist Intern at Amazon Science, working on multi-objective LLM fine-tuning and multi-objective optimization.
Education
PhD Candidate: Institute for Computational and Mathematical Engineering, Stanford University; Supervisors: Prof. Lexing Ying (Applied Mathematics) and Prof. Grant M. Rotskoff (Theoretical Chemistry). BSc: School of Mathematical Sciences, Peking University; Supervisor: Prof. Ruo Li.
Background
Research interests lie at the intersection of machine learning, stochastic analysis, and numerical analysis. Current work focuses on the mathematical foundations and algorithmic design of flow- and diffusion-based generative models.