Multiplayer Federated Learning: Reaching Equilibrium with Less Communication (Manuscript)
Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique (ICML 2024 Spotlight)
Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback (NeurIPS 2023)
Diffusion Probabilistic Models Generalize when They Fail to Memorize (ICMLW 2023)
Accelerated Minimax Algorithms Flock Together (SIAM Journal on Optimization)
Robust Probabilistic Time Series Forecasting (AISTATS 2022)
Accelerated Algorithms for Smooth Convex-Concave Minimax Problems with O(1/k^2) Rate on Squared Gradient Norm (ICML 2021 Long talk)
WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points (ICML 2021)
Research Experience
Rufus Isaacs Postdoctoral Fellow, Department of Applied Mathematics & Statistics, Johns Hopkins University
Background
Rufus Isaacs Postdoctoral Fellow in the Department of Applied Mathematics & Statistics at Johns Hopkins University. Research focuses on optimization and the theoretical aspects of machine learning.