Fanxu Meng 孟繁续
Google Scholar ID: xvfuhRUAAAAJ
Peking University
Machine Learning · Model Compression
Citations & Impact (all-time)
  • Citations: 629
  • H-index: 9
  • i10-index: 8
  • Publications: 14
  • Co-authors: 5
Resume (English only)
Academic Achievements
  • Selected Publications:
    - TransMLA: Multi-Head Latent Attention Is All You Need (NeurIPS 2025 spotlight)
    - HD-PiSSA: High-Rank Distributed Orthogonal Adaptation (EMNLP 2025 oral)
    - CLOVER: Cross-Layer Orthogonal Vectors Pruning and Fine-Tuning (ICML 2025)
    - PiSSA: Principal Singular values and Singular vectors Adaptation (NeurIPS 2024 spotlight)
    - RM: Removing Residual Connection Equivalently (arXiv preprint)
    - Pruning Filter in Filter (NeurIPS 2020)
    - Filter Grafting for Deep Neural Networks (CVPR 2020)
Research Experience
  • Interned and later worked full-time at Tencent YouTu for over two years, collaborating with Xing Sun, Hao Cheng, Ke Li, and Di Yin.
Education
  • Ph.D.: Institute for Artificial Intelligence, Peking University. Advisor: Prof. Muhan Zhang.
  • M.S.: Harbin Institute of Technology, Shenzhen. Advisor: Prof. Guangming Lu.
Background
  • Research Interests: Parameter-efficient fine-tuning of large language models (LLMs) and efficient inference for long-context LLMs.
  • Served as a reviewer for leading conferences and journals, including NeurIPS, ICML, ICLR, TPAMI, COLM, AAAI, and IJCAI.