Received an Outstanding Paper Award at ACL 2025 for 'Between Circuits and Chomsky: Pre-pretraining on Formal Languages Imparts Linguistic Biases'
Co-authored 'OLMo: Accelerating the Science of Language Models', which received the Best Theme Paper Award at ACL 2024
Published numerous papers at top venues, including ICLR, ICML, EMNLP, COLM, TACL, ACL, and DLT, on topics such as transformer expressivity, semantic learning in language models, and state-space models
Key contributions include 'Exact Expressive Power of Transformers with Padding', 'A Little Depth Goes a Long Way: The Expressive Power of Log-Depth Transformers', and 'The Expressive Power of Transformers with Chain of Thought'