Research Experience
Published multiple papers, including 'Theory-Informed Improvements to Classifier-Free Guidance for Discrete Diffusion Models' and 'What Exactly Does Guidance Do in Masked Discrete Diffusion Models'; participated in several research projects, such as 'Evaluating the Design Space of Diffusion-based Generative Models' and 'A Separation in Heavy-Tailed Sampling: Gaussian vs. Stable Oracles for Proximal Samplers'.
Teaching Experience
Taught MAT016C — Short Calculus at UC Davis in Summer 2019; MATH1553 — Introduction to Linear Algebra at Georgia Tech in Fall 2023; and MATH2552 — Differential Equations at Georgia Tech in Spring 2024.
Education
Ph.D. in Mathematics from the University of California, Davis, advised by Prof. Krishna Balasubramanian.
Background
Currently a Hale Visiting Assistant Professor in the School of Mathematics at the Georgia Institute of Technology, hosted by Prof. Molei Tao. Research interests: mathematical foundations of modern AI, with a focus on scalable inference methods, including sampling, diffusion models, and stochastic optimization.
Miscellany
Contact: Email: yhe367@gatech.edu; Office: Skiles 016; Profile links: Google Scholar