Siyu Ren

Google Scholar ID: jkJDyrkAAAAJ
Shanghai Jiao Tong University
NLP
Citations & Impact (all-time)
  • Citations: 285
  • H-index: 9
  • i10-index: 8
  • Publications: 16
  • Co-authors: 6
Academic Achievements
  • Published papers in natural language processing and model compression, including:
  • A Controlled Study on Long Context Extension and Generalization in LLMs (COLM 2025)
  • Symbol-LLM: Towards Foundational Symbol-centric Interface For Large Language Models (ACL 2024)
  • On the Efficacy of Eviction Policy for Key-Value Constrained Generative Language Model Inference (arXiv preprint)
  • EMO: Earth Mover Distance Optimization for Auto-regressive Language Modeling (ICLR 2024)
  • Context Compression for Auto-regressive Transformers with Sentinel Tokens (EMNLP 2023)
  • Zero-shot Faithfulness Evaluation for Text Summarization with Foundation Language Model (EMNLP 2023)
  • Combating Short Circuit Behavior in Natural Language Reasoning: Crossover and Mutation Operations for Enhanced Robustness (ECAI 2023)
  • Pruning Pre-trained Language Models with Principled Importance and Self-regularization (ACL 2023 Findings)
  • Low-Rank Prune-And-Factorize for Language Model Compression (COLING 2024)
  • Taxonomy of Abstractive Dialogue Summarization: Scenarios, Approaches and Future Directions (ACM Computing Surveys)
  • Specializing Pre-trained Language Models for Better Relational Reasoning via Network Pruning (NAACL-HLT 2022 Findings)
  • Leaner and Faster: Two-Stage Model Compression for Lightweight Text-Image Retrieval (NAACL-HLT 2022)
  • Knowledge-Driven Distractor Generation for Cloze-Style Multiple Choice Questions (AAAI 2021)
  • Multi-turn Response Selection using Dialogue Dependency Relations (EMNLP 2020)
Research Experience
  • Currently engaged in R&D on large language models and multimodal models at Meituan.
Education
  • Ph.D. from Shanghai Jiao Tong University, graduated in 2024, advised by Kenny Q. Zhu.
Background
  • Research interests include natural language processing, efficient language models, model compression, knowledge distillation, pruning, and inference acceleration.
Miscellany
  • Reviewer for several academic conferences including ICLR, COLM, ACL, NAACL, EMNLP, and AAAI.
  • Awards: GuangHua Scholarship (2021-2022), Undergraduate Excellent Student Second-Class Scholarship (2017-2018, 2016-2017).