Browse publications on Google Scholar
Resume (English only)
Academic Achievements
Proposed the MetaFormer hypothesis 'MetaFormer is actually what you need', challenging the common belief 'Attention is all you need', and built an embarrassingly simple model, PoolFormer, to verify it. Received academic awards including the Snap Research Fellowship and the Google TRC Researcher Spotlight. Publications include 'MetaFormer Is Actually What You Need for Vision' (CVPR 2022 Oral) and 'MM-Vet: Evaluating Large Multimodal Models for Integrated Capabilities' (ICML 2024).
Research Experience
Currently a Research Scientist at ByteDance Seed; previously a research intern at Amazon AWS AI Lab, Microsoft Azure AI, and Sea AI Lab.
Education
PhD: National University of Singapore (NUS), advised by Prof. Xinchao Wang; worked closely with Prof. Shuicheng Yan and Prof. Jiashi Feng
M.Eng.: Sun Yat-sen University (SYSU), advised by Prof. Liang Lin
B.S.: South China Normal University (SCNU)
Background
Research interests: Deep Learning, Multimodal AI; Languages: Teochew, Mandarin, English, C, Python