TaeHo Yoon
Google Scholar ID: f0B1Zr8AAAAJ
Johns Hopkins University
Optimization · Machine learning theory
Citations & Impact (all-time)
  • Citations: 385
  • H-index: 8
  • i10-index: 8
  • Publications: 12
  • Co-authors: 0
Academic Achievements
  • Multiplayer Federated Learning: Reaching Equilibrium with Less Communication (Manuscript)
  • Optimal Acceleration for Minimax and Fixed-Point Problems is Not Unique (ICML 2024 Spotlight)
  • Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback (NeurIPS 2023)
  • Diffusion Probabilistic Models Generalize when They Fail to Memorize (ICMLW 2023)
  • Accelerated Minimax Algorithms Flock Together (SIAM Journal on Optimization)
  • Robust Probabilistic Time Series Forecasting (AISTATS 2022)
  • Accelerated Algorithms for Smooth Convex-Concave Minimax Problems with O(1/k^2) Rate on Squared Gradient Norm (ICML 2021 Long talk)
  • WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points (ICML 2021)
Research Experience
  • Rufus Isaacs Postdoctoral Fellow, Department of Applied Mathematics & Statistics, Johns Hopkins University
Background
  • Rufus Isaacs Postdoctoral Fellow at Johns Hopkins University, Department of Applied Mathematics & Statistics. His research focuses on optimization and theoretical aspects of machine learning.
Miscellany
  • GitHub, Twitter, Scholar