Publications
Published multiple papers, including 'Accelerated Parameter-Free Stochastic Optimization' (COLT 2024) and 'The Price of Adaptivity in Stochastic Convex Optimization' (COLT 2024, best paper award). Involved in several projects, such as 'DataComp-LM: In Search of the Next Generation of Training Sets for Language Models'.
Research Experience
Currently an assistant professor at Tel Aviv University’s School of Computer Science.
Education
Completed a PhD at Stanford University in 2020, advised by John Duchi and Aaron Sidford; previously obtained a B.Sc. and M.Sc. from the Technion – Israel Institute of Technology, where he worked with Shlomo Shamai and Tsachy Weissman.
Background
Research interests include machine learning, optimization, and statistics, with a particular focus on understanding and overcoming fundamental limits. Current research focuses on making machine learning and optimization algorithms robust and reliable.
Miscellany
Name is pronounced 'Yah-ear', with the Hebrew pronunciation of the 'r'.