Jongwoo Ko
Google Scholar ID: l2jkwHwAAAAJ
Senior Researcher, Microsoft | Ph.D, KAIST AI
Efficient AI · Large Language Models
Citations & Impact (All-time)
  • Citations: 526
  • H-index: 10
  • i10-index: 10
  • Publications: 20
  • Co-authors: 17
Resume (English only)
Academic Achievements
  • Publications: DistiLLM-2: A Contrastive Approach Boosts the Distillation of LLMs, ICML 2025; DistiLLM: Towards Streamlined Distillation for Large Language Models, ICML 2024; Fast and Robust Early-Exiting Framework for Autoregressive Language Models with Synchronized Parallel Decoding, EMNLP 2023; CUDA: Curriculum of Data Augmentation for Long-tailed Recognition, ICLR 2023; FINE Samples for Learning with Noisy Labels, NeurIPS 2021.
  • Awards: Winner, Qualcomm Innovation Fellowship Korea, 2024 and 2022; Silver Prize, 30th Samsung Humantech Paper Awards, 2023.
Research Experience
  • Senior Research Scientist at Microsoft ASG, Sep. 2025 - Present
  • Research Intern at Microsoft, Aug. 2024 - Nov. 2024 (12 weeks)
  • Applied Scientist Intern at Amazon AGI, Apr. 2024 - Jun. 2024
Education
  • Ph.D., Graduate School of AI, KAIST, Advisor: Se-Young Yun, Mar. 2020 - Aug. 2025
  • M.S., Industrial and Systems Engineering, KAIST, Advisor: Heeyoung Kim, Mar. 2018 - Feb. 2020
  • B.S., Industrial and Systems Engineering, KAIST, Magna Cum Laude, Mar. 2014 - Feb. 2018
Background
  • Research interests: advancing small foundation models, particularly (multi-modal) language models.
Miscellany
  • Contact: (first-name)(last-name)96 [at] gmail [dot] com [CV / Scholar / Github / LinkedIn]