Ekansh Sharma
Scholar

Google Scholar ID: YLf2td8AAAAJ
Citations & Impact (all-time)
  • Citations: 28
  • H-index: 2
  • i10-index: 2
  • Publications: 9
  • Co-authors: 7
Resume (English only)
Academic Achievements
  • Sparse Training from Random Initialization: Aligning Lottery Ticket Masks using Weight Symmetry (ICML 2025)
  • The Non-Local Model Merging Problem: Permutation Symmetries and Variance Collapse (arXiv:2410.12766, submitted)
  • Simultaneous Linear Connectivity of Neural Networks Modulo Permutation (ECML PKDD 2024)
  • Bootstrap estimators for the tail-index and for the count statistics of graphex processes (Electronic Journal of Statistics, 2021)
  • Approximations in Probabilistic Programs (PROBPROG 2020)
  • Exchangeable modelling of relational data: checking sparsity, train-test splitting, and sparse exchangeable Poisson matrix factorization (arXiv:1712.02311, 2020)
Background
  • Final-year PhD candidate in the Department of Computer Science at the University of Toronto, advised by Dan Roy
  • Affiliated with the Vector Institute
  • Works primarily in deep learning, with broad interests spanning machine-learning methodology (algorithms) and theory
  • Current research exploits geometric properties of deep neural network loss landscapes to design new algorithms for parsimonious machine learning
  • Aims to develop time-efficient and resource-conscious methods to democratize and broaden ML deployment
  • Previously explored probabilistic programming, Bayesian nonparametrics, and computational biology