Publications
Hamiltonian Descent Algorithms for Optimization: Accelerated Rates via Randomized Integration Time, NeurIPS 2025 (Spotlight)
From Randomized Hamiltonian Flow to Fast Stochastic Optimization, NeurIPS Workshop 2025
Mean-field Underdamped Langevin Dynamics and its Space-time Discretization, ICML 2024
Accelerated Stochastic Optimization Methods under Quasar-convexity, ICML 2023
The Fractional Laplacian-based Image Inpainting, Inverse Problems and Imaging 2024
Service
Reviewer for AISTATS 2023, ICLR 2025–2026, and NeurIPS 2025 (top reviewer)
Background
Research interests: optimization, statistics, and machine learning. My work focuses on developing provably efficient algorithms for optimization and statistics that improve the training and inference of modern machine learning models.