Academic Achievements
Selected Publications: 'Topological Invariance and Breakdown in Learning' (arXiv), 'A universal compression theory: Lottery ticket hypothesis and superpolynomial scaling laws' (arXiv), 'Neural Thermodynamics I: Entropic Forces in Deep and Universal Representation Learning' (NeurIPS 2025), 'Heterosynaptic Circuits Are Universal Gradient Machines' (Preprint 2025). Conference papers at ICLR 2022, ICML 2024, NeurIPS 2025, and other venues. Has served as an Area Chair for NeurIPS and ICLR.
Research Experience
Works with Prof. Isaac Chuang at MIT and collaborates with Prof. Tomaso Poggio in the BCS department. Research focuses on the theoretical foundations of deep learning. Previously pursued a PhD in Physics at the University of Tokyo.
Education
PhD in Physics from the University of Tokyo, supervised by Prof. Masahito Ueda; Bachelor's degree in Physics and Mathematics from Carnegie Mellon University.
Background
Research Interests: Scientific principles of artificial intelligence, theories of deep learning, universal laws of AI, and a grand unified theory of intelligence. Brief Introduction: A researcher at MIT and NTT Research, working at the intersection of mathematics, physics, neuroscience, and artificial intelligence.
Miscellany
Personal interests include art, literature, and philosophy. Enjoys playing Go.