Multiple papers accepted at top-tier venues including ICLR, NeurIPS, ICML, EMNLP, and TMLR
2025: "(Almost) Free Modality Stitching of Foundation Models" accepted to EMNLP 2025 (Main Conference)
2025: "Celo" and "Meta-learning Optimizers for Communication-Efficient Learning" accepted to TMLR
2025: Invited talk at ICLR 2025 Workshop on Weight Space Learning
2024: "Neural Graphs" accepted as oral at ICLR 2024
2023: "GHN-3" accepted at ICML 2023; two papers accepted at ICML 2023 workshops
2022: PhD thesis approved and published; "Model Zoos" accepted to NeurIPS 2022 Datasets and Benchmarks Track; "Hyper-Representations" accepted to NeurIPS 2022
2021: Two papers accepted at ICCV 2021; two papers accepted at NeurIPS 2021; named "Outstanding Reviewer" at ICCV 2021 (top 5% student reviewers)
Regular reviewer for ICML, NeurIPS, ICLR, CVPR, and other top conferences
Background
Research Scientist at Samsung SAIT AI Lab
Adjunct Professor at the University of Montreal
Research interests include graph neural networks (GNNs), large language models (LLMs), optimization, and meta-learning
Applications span computer vision, language modeling, and molecule discovery
Open to supervising graduate students on topics such as neural network weight representation, learning to optimize, LLMs and GNNs for scientific discovery, and compressing/merging large models