Browse publications on Google Scholar
Resume (English only)
Academic Achievements
Publications:
- 'A Hessian-Aware Stochastic Differential Equation for Modelling SGD', Preprint, 2024
- 'Achieving Near-Optimal Convergence for Distributed Minimax Optimization with Adaptive Stepsizes', NeurIPS, 2024
- 'Parameter-Agnostic Optimization under Relaxed Smoothness', AISTATS, 2024
- 'Two Sides of One Coin: the Limits of Untuned SGD and the Power of Adaptive Methods', NeurIPS, 2023
- 'TiAda: A Time-scale Adaptive Algorithm for Nonconvex Minimax Optimization', ICLR, 2023
- 'Nest Your Adaptive Algorithm for Parameter-Agnostic Nonconvex Minimax Optimization', NeurIPS, 2022
- 'Adversarial Open-World Person Re-Identification', ECCV, 2018
Teaching Experience
- Head teaching assistant, Optimization for Data Science (Fall 2024), ETH Zurich
- Teaching assistant, Optimization for Data Science (Fall 2023), ETH Zurich
- Teaching assistant, Computational Intelligence Laboratory (Spring 2022), ETH Zurich
- Teaching assistant, Deep Learning (Fall 2021), ETH Zurich
Education
Ph.D. in Computer Science, ETH Zurich, supervised by Prof. Niao He.
Background
Research Interests: Designing provably efficient machine learning algorithms and providing theoretical understanding and guarantees for them.
Miscellany
Hobbies: Watching football (a Chelsea fan). Heavy Vim user.