Dachuan Shi

Google Scholar ID: ejECvlYAAAAJ
Georgia Tech
Research Interests: AI Reasoning, Efficient Deep Learning, Vision and Language
Citations & Impact (all-time)
  • Citations: 437
  • H-index: 6
  • i10-index: 5
  • Publications: 10
  • Co-authors: 0
Academic Achievements
  • Paper: SwiReasoning: Switch-Thinking in Latent and Explicit for Pareto-Superior Reasoning LLMs, Preprint, 2025
  • Paper: LaCache: Ladder-Shaped KV Caching for Efficient Long-Context Modeling of Large Language Models, ICML, 2025
  • Paper: CrossGET: Cross-Guided Ensemble of Tokens for Accelerating Vision-Language Transformers, ICML, 2024
  • Paper: UPop: Unified and Progressive Pruning for Compressing Vision-Language Transformers, ICML, 2023
  • Paper: Mitigating Forgetting Between Supervised and Reinforcement Learning Yields Stronger Reasoners, Preprint, 2025
  • Paper: Superficial Self-Improved Reasoners Benefit from Model Merging, EMNLP, 2025
  • Paper: Supervised Fine-tuning in turn Improves Visual Foundation Models, Preprint, 2024
  • Paper: Masked Generative Distillation, ECCV, 2022
Research Experience
  • Research Intern @ Microsoft, Redmond, WA, USA, 2025
  • Research Intern @ Shanghai AILab, Shanghai, CN, 2022–24
  • TA @ CS4476 Computer Vision, Georgia Tech, Spring 2025
Education
  • Currently a second-year CS Ph.D. student at Georgia Tech, working with Prof. Wenke Lee. Previously received an M.S. and a B.S. in CS from Tsinghua University.
Background
  • Working on LLM/MLLM reasoning and inference, with a focus on efficiency and alignment.
Miscellany
  • Honors: Tsinghua Outstanding Master's Thesis, 2024; Tsinghua Outstanding Bachelor's Thesis, 2021.
  • Service: Reviewer for ICML, NeurIPS, ICLR, CVPR, ICCV, ECCV, ACL ARR, and TPAMI.