Xuezhe Ma

Google Scholar ID: 6_MQLIcAAAAJ
Information Sciences Institute, University of Southern California
Natural Language Processing · Machine Learning · Deep Generative Models · Dependency Parsing
Citations & Impact (all-time)
  • Citations: 9,165
  • H-index: 31
  • i10-index: 42
  • Publications: 20
  • Co-authors: 79
Academic Achievements
  • Published numerous papers, including 'Megalodon: Efficient LLM Pretraining and Inference with Unlimited Context Length'; received the Outstanding Paper Award at EMNLP 2023.
Research Experience
  • Since Fall 2020, Research Lead at the Information Sciences Institute, University of Southern California. Research focuses on developing efficient, unified neural architectures and learning algorithms that learn a universal semantic space across data modalities, as well as efficient and robust architectures and methods for modeling long-range dependencies in LLMs.
Education
  • Since Fall 2020, Research Assistant Professor, Department of Computer Science, University of Southern California
  • Ph.D., Language Technologies Institute, Carnegie Mellon University, advised by Prof. Eduard Hovy
  • Master's degree, Center for Brain-like Computing and Machine Intelligence, Shanghai Jiao Tong University, China
  • Bachelor's degree in Computer Science, Shanghai Jiao Tong University; member of the ACM Class, now part of Zhiyuan College at SJTU
Background
  • Research interests center on representation learning with deep learning methods, aiming to improve its effectiveness, efficiency, interpretability, and robustness. Particular focus areas are the efficiency of multi-modal large language models (LLMs), efficient and robust long-context modeling in LLMs, and applications and evaluation methods for multi-modal LLMs on long sequential data.