2025 preprint 'Denoising Score Distillation: From Noisy Diffusion Pretraining to One-Step High-Quality Generation' proposes training a one-step image generator using only noisy images, achieving FID scores comparable to those of diffusion models trained on clean images; the work demonstrates that distillation can enhance generation quality beyond merely accelerating sampling
Paper 'Conditional diffusions for amortized neural posterior estimation' accepted at AISTATS 2025
Two papers accepted at NeurIPS 2024: 'Diffusion Policies creating a Trust Region for Offline Reinforcement Learning' and 'Identifying General Mechanism Shifts in Linear Causal Representations'
Published 'iSCAN: identifying causal mechanism shifts among nonlinear additive noise models' at NeurIPS 2023
Published 'Model-based trajectory inference for single-cell RNA sequencing using deep learning with a mixture prior' in PNAS
Background
Second-year PhD student at the University of Texas at Austin, supervised by Professor Mingyuan Zhou
Broad research interests in statistical machine learning, including: