Achievements
Developed LUKE, a knowledge-enhanced language model released in 2020 that achieved state-of-the-art performance on a wide range of NLP tasks, with related papers receiving over 1,000 citations as of 2025. Performed exceptionally in multiple international competitions, including the NIPS 2017 Human-Computer QA Competition and the NeurIPS 2020 EfficientQA Competition. Achieved top performance in the NEEL Challenge at WWW 2015 and the W-NUT Shared Task at ACL 2015. Developed Wikipedia2Vec, an open-source tool for learning entity embeddings. Planned, co-authored, and supervised a series of introductory Japanese books on large language models.
Research Experience
Conducts research on large language models and pre-trained language models, question answering (QA) systems, and named entity recognition and linking in text.
Education
Holds a Ph.D. Also a Kaggle Master.
Background
Chief Scientist at Studio Ousia Inc. Also serves as Specially Appointed Professor (Visiting) at the Center for Language AI Research, Tohoku University; Visiting Professor at the Center for Artificial Intelligence, Mathematical, and Data Science, Nagoya University; and Visiting Scientist at the RIKEN Center for Advanced Intelligence Project. Passionate about developing innovative technologies that benefit society, with a current focus on natural language processing and large language models.