Hangbo Bao
Google Scholar ID: lXCZGqYAAAAJ
Microsoft Research
Research areas: Natural Language Processing, Computer Vision, Multimodal, Representation Learning
Citations & Impact (all-time)
  • Citations: 10,020
  • H-index: 15
  • i10-index: 16
  • Publications: 19
  • Co-authors: 12
Academic Achievements
  • Image as a Foreign Language: BEiT Pretraining for All Vision and Vision-Language Tasks (Preprint)
  • A Unified View of Masked Image Modeling (Preprint)
  • BEiT v2: Masked Image Modeling with Vector-Quantized Visual Tokenizers (Preprint)
  • VL-BEiT: Generative Vision-Language Pretraining (Preprint)
  • THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption (Findings of ACL, 2022)
  • Corrupted Image Modeling for Self-Supervised Visual Pre-Training (ICLR, 2023)
  • VLMo: Unified Vision-Language Pre-Training with Mixture-of-Modality-Experts (NeurIPS, 2022)
  • BEiT: BERT Pre-Training of Image Transformers (ICLR, 2022, Oral)
  • Attention Temperature Matters in Abstractive Summarization Distillation (ACL, 2022)
  • s2s-ft: Fine-Tuning Pretrained Transformer Encoders for Sequence-to-Sequence Learning (Preprint)
  • Learning to Sample Replacements for ELECTRA Pre-Training (Findings of ACL, 2021)
  • MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers (Findings of ACL, 2021)
  • MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers (NeurIPS, 2020)
  • UniLMv2: Pseudo-Masked Language Models for Unified Language Model Pre-Training (ICML, 2020)
Research Experience
  • Research Intern, Natural Language Computing Group, Microsoft Research Asia, Jul. 2016 – Sep. 2017. Mentor: Dr. Furu Wei.
  • Research Intern, Natural Language Computing Group, Microsoft Research Asia, Mar. 2018 – Present. Mentors: Dr. Furu Wei & Dr. Li Dong.
Education
  • B.S., School of Computer Science and Technology, Harbin Institute of Technology, Sept. 2013 – Jul. 2017.
  • Ph.D. student, School of Computer Science and Technology, Harbin Institute of Technology, Sept. 2017 – 2023. Joint Ph.D. program with Microsoft Research Asia.
Background
  • Research Interests: Pre-trained models, natural language processing, and representation learning. Currently a final-year Ph.D. student at the School of Computer Science and Technology, Harbin Institute of Technology, advised by Prof. Songhao Piao, and a long-term research intern at Microsoft Research Asia, mentored by Dr. Furu Wei and Dr. Li Dong.