Publications
'From Acceleration to Saturation: Scaling Behavior of Bootstrapped Language Model Pretraining' (October 2025); 'Optimal Variance and Covariance Estimation under Differential Privacy in the Add-Remove Model and Beyond' (September 2025); paper accepted to ICML 2025.
Awards
Recipient of the DBSJ Kambayashi Young Researcher Award, the Marie Skłodowska-Curie Early Stage Researcher (ESR) Fellowship, the JSPS (Japan Society for the Promotion of Science) Research Fellowship for Young Scientists, and the MEXT Scholarship.
Research Experience
Worked as a Marie Skłodowska-Curie Early Stage Researcher (ESR) at the Technical University of Munich and as a researcher at NEC Corporation.
Education
Ph.D. in Physics from the University of Tokyo in 2017; M.S. in Physics from the University of Tokyo in 2014; B.S. in Physics from the University of Tokyo in 2012.
Background
A research scientist at SB Intuitions/LY/LINE Corporation working on machine learning. Previously a particle physicist. Current interests include training large language models, the security and privacy of machine learning, statistics, theoretical computer science, and physics and mathematics more broadly.
Miscellany
Based in Tokyo, Japan. Social media: LinkedIn, Twitter, GitHub, Google Scholar.