Co-authors
- 5 (list available)
Resume (English only)
Academic Achievements
- Publications:
- - NeurIPS 2025: PHYSICS, MoE-Gyro, UltraDelta
- - ICCV 2025: FREE-Merging
- - NeurIPS 2024: IntraMix
- - AAAI 2024: DCLP
- - ICDE 2023: AutoTSC
- - VLDB 2021: Assassin
- Preprints:
- - arXiv 2025: Decouple and Orthogonalize: A Data-Free Framework for LoRA Merging
Research Experience
- During his master's program at Harbin Institute of Technology, he has been involved in multiple research projects on large language models and multimodal models. He is currently interning at Shanghai AI Lab, conducting research on large language models under the guidance of Dr. Peng Ye and collaborating closely with Dr. Ganqu Cui.
Education
- 2023.08 - Present: Master's candidate in Computer Science, Harbin Institute of Technology. Advisor: Prof. Hongzhi Wang.
- 2019.08 - 2023.06: B.Eng. in Computer Science, Harbin Institute of Technology.
Background
- Research Interests: Large Language Models, Multimodal Models, Efficient AI, Neural Architecture Search (NAS). Currently interning at Shanghai AI Lab, working on large language models.
Miscellany
- Email: shenghez.zheng@gmail.com