Jiang Hu

Google Scholar ID: WIlpQFwAAAAJ
YMSC, Tsinghua University
Optimization, machine learning
Citations & Impact (all-time)
  • Citations: 811
  • h-index: 11
  • i10-index: 11
  • Publications: 20
  • Co-authors: 15
Resume (English only)
Academic Achievements
  • [9/2025] Our paper ‘Adaptive Riemannian ADMM for Nonsmooth Optimization: Optimal Complexity without Smoothing’ was accepted by NeurIPS 2025.
  • [9/2025] Our paper ‘Non-convex composite federated learning with heterogeneous data’ was accepted by Automatica.
  • [9/2025] Our paper ‘Decentralized projected Riemannian gradient method for smooth optimization on compact submanifolds embedded in the Euclidean space’ was accepted by Numerische Mathematik.
  • [9/2025] Our paper ‘Oracle complexity of augmented Lagrangian methods for nonsmooth manifold optimization’ was accepted by Mathematics of Operations Research.
  • [3/2025] Our paper ‘On the local convergence of the semismooth Newton method for composite optimization’ was accepted by the Journal of Scientific Computing.
  • [2/2025] Our paper ‘Achieving Local Consensus over Compact Submanifolds’ was accepted by IEEE Transactions on Automatic Control.
  • [1/2025] Our paper ‘An Augmented Lagrangian Primal-Dual Semismooth Newton Method for Multi-block Composite Optimization’ was accepted by the Journal of Scientific Computing.
  • [12/2024] Our paper ‘Decentralized projected Riemannian stochastic recursive momentum method for nonconvex optimization’ was accepted by AAAI 2025.
  • [9/2024] Our paper ‘Nonconvex Federated Learning on Compact Smooth Submanifolds With Heterogeneous Data’ was accepted by NeurIPS 2024.
  • [7/2024] Our manuscript ‘Improving the communication in decentralized manifold optimization through single-step consensus and compression’ is now available on arXiv.
  • [5/2024] Our paper ‘Convergence analysis of an adaptively regularized natural gradient method’ was accepted by IEEE Transactions on Signal Processing.
  • [4/2024] Honored to receive the Best Paper Award at ICASSP 2024 (1 out of 2826 accepted papers).
  • [2/2024] Our paper ‘A projected semismooth Newton method for a class of nonconvex composite programs with strong prox-regularity’ was accepted by the Journal of Machine Learning Research.
  • [1/2024] Our paper ‘Riemannian Natural Gradient Methods’ was published in SIAM Journal on Scientific Computing.
Research Experience
  • Currently an Assistant Professor at the Yau Mathematical Sciences Center, Tsinghua University.
Education
  • Ph.D. from Peking University; previously held research positions at the Chinese University of Hong Kong, Harvard Medical School, and the University of California, Berkeley.
Background
  • Assistant Professor at the Yau Mathematical Sciences Center, Tsinghua University. Current research focuses on mathematical optimization and its applications in machine learning and artificial intelligence.