Tatsuya Aoyama
Google Scholar ID: AvhNwScAAAAJ
Research Scientist, Meta
language modeling · pretraining dynamics · interpretability · cognitive science
Citations & Impact (All-time)
  • Citations: 151
  • H-index: 6
  • i10-index: 4
  • Publications: 17
  • Co-authors: 9
Academic Achievements
  • Published 'Language Models Grow Less Humanlike beyond Phase Transition' at ACL 2025 (with Ethan Wilcox)
  • Published 'Anything Goes? A Crosslinguistic Study of (Im)possible Language Learning in LMs' at ACL 2025 (with Xiulin Yang, Yuekun Yao, Ethan Wilcox)
  • Published 'Modeling Nonnative Sentence Processing with L2 Language Models' at EMNLP 2024 (with Nathan Schneider)
  • Published 'Probe-Less Probing of BERT’s Layer-Wise Linguistic Knowledge with Masked Word Prediction' at NAACL-SRW 2022 (with Nathan Schneider)
  • Co-authored 'Predicting the Formation of Induction Heads' at CogInterp @ NeurIPS 2025 (with Ethan Wilcox and Nathan Schneider)
  • Co-authored 'Unpacking Let Alone: Human-Scale Models Generalize to a Rare Construction in Form but not Meaning' at EMNLP 2025
  • Co-authored 'GDTB: Genre Diverse Data for English Shallow Discourse Parsing across Modalities, Text Types, and Domains' at EMNLP 2024
  • Contributed to 'Global PIQA: Evaluating Physical Commonsense Reasoning Across 100+ Languages and Cultures' on arXiv 2025