Liam Hodgkinson
Google Scholar ID: m4iFqOsAAAAJ
University of Melbourne
probabilistic machine learning, deep learning theory
Citations & Impact (all-time)
Citations: 697
H-index: 12
i10-index: 15
Publications: 20
Co-authors: 8
Resume
Academic Achievements
  • Evaluating natural language processing models with generalization metrics that do not need access to any training or testing data (2022, arXiv preprint)
  • The reproducing Stein kernel approach for post-hoc corrected sampling (2020, arXiv preprint)
  • Generalization Bounds using Lower Tail Exponents in Stochastic Optimizers (2022, ICML conference paper)
  • Fat-Tailed Variational Inference with Anisotropic Tail Adaptive Flows (2022, ICML conference paper)
  • Stateful ODE-Nets using Basis Function Expansions (2021, NeurIPS conference paper)
  • Taxonomizing local versus global structure in neural network loss landscapes (2021, NeurIPS conference paper)
  • Noisy recurrent neural networks (2021, NeurIPS conference paper)
  • Fast approximate simulation of finite long-range spin systems (2020, The Annals of Applied Probability)
  • Stochastic Continuous Normalizing Flows (2021, UAI conference paper)
  • Geometric Rates of Convergence for Kernel-based Sampling Algorithms (2021, UAI conference paper)
  • Multiplicative noise and heavy tails in stochastic optimization (2021, ICML conference paper)
  • Shadow Manifold Hamiltonian Monte Carlo (2021, AISTATS conference paper)
  • Implicit Langevin Algorithms for Sampling From Log-concave Densities (2021, Journal of Machine Learning Research)
Research Experience
  • Postdoctoral research scholar at the University of California, Berkeley, and the International Computer Science Institute (ICSI), under the supervision of Michael Mahoney.
Education
  • Ph.D. from the University of Queensland, supervised by Prof. Philip Pollett and Dr. Ross McVinish.
Background
  • He is a Lecturer in Data Science in the School of Mathematics and Statistics at the University of Melbourne. His research interests lie primarily in probabilistic machine learning (including Bayesian methods), the theory of deep learning, and robust training schemes for neural networks.