Has published numerous papers at top international conferences, including ICML, NeurIPS, EMNLP, and ICLR, on topics such as deep learning theory, transformer model analysis, and data augmentation techniques in Bayesian neural networks.
Research Experience
Has worked on multiple deep learning research projects spanning both theory and practical applications.
Background
Research interests include the theory of deep learning, machine learning inspired by natural intelligence and neurobiology, and deep learning for the sciences, particularly astrophysics.
Miscellany
Lab members have received awards including the ETH Medal and Google PhD Fellowship.