- Published in top-tier conferences like EACL, EMNLP, and EuroVis
- Received recognition including an MVP award at Huawei Noah’s Ark Lab
- Example publications: UniChart: A Universal Vision-Language Pretrained Model for Chart Comprehension and Reasoning; S2D: Sorted Speculative Decoding for More Efficient Deployment of Nested Large Language Models
Research Experience
- AI Researcher at Huawei Canada. Introduced Sorted LLaMA, a method for integrating nested submodels into a single LLM, and developed a confidence-based early-exiting mechanism to accelerate inference.
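To illustrate the general idea behind confidence-based early exiting (this is a minimal toy sketch, not the actual Sorted LLaMA implementation; the function names and threshold value are illustrative assumptions): each intermediate layer produces logits, and decoding stops at the first layer whose top-class probability clears a confidence threshold, skipping the deeper, costlier layers.

```python
import math

def softmax(logits):
    """Convert raw logits to a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def early_exit_predict(layer_logits, threshold=0.9):
    """Toy early-exit loop (illustrative, not the paper's code).

    Walk the intermediate layers in order; return the prediction of
    the first layer whose top-class probability reaches `threshold`,
    so deeper layers need not be computed.
    Returns (predicted_class, exit_layer_index).
    """
    for i, logits in enumerate(layer_logits):
        probs = softmax(logits)
        top = max(probs)
        if top >= threshold:
            return probs.index(top), i
    # No layer was confident enough: fall back to the final layer.
    probs = softmax(layer_logits[-1])
    return probs.index(max(probs)), len(layer_logits) - 1
```

In a real model the loop body would run one transformer layer at a time and attach a lightweight classifier head to each exit point; the savings come from never executing the layers after the chosen exit.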
Education
- MSc in Computer Science from York University, supervised by Professor Enamul Hoque, focusing on natural language interactions with visualizations.
- BSc in Computer Engineering from Amirkabir University of Technology, exploring deep learning for anomaly detection and text chunking.
Background
An NLP researcher specializing in the development and optimization of Large Language Models (LLMs). His research interests include efficient LLM training, inference acceleration, and multimodal systems that integrate NLP and computer vision.