Currently conducting research at Stanford on designing neural networks with associative memory.
Education
Before coming to Stanford, he studied math, physics, and computer science at Cornell.
Background
He is a fifth-year PhD candidate in Computer Science at Stanford, working with Emily B. Fox. Most recently, he has been interested in designing neural networks with associative memory, which can make sequence models better at handling long-context tasks without blowing up the hardware. His research aims to build explicit structure into neural networks, enabling more data-efficient training and even new capabilities.
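To make the long-context point concrete, here is a minimal sketch (not his actual architecture; the function name and dimensions are illustrative) of the fast-weights / linear-attention style of associative memory: the model keeps a single fixed-size matrix as its state, writes key–value associations into it with outer products, and reads from it with a query, so per-step cost and memory stay constant no matter how long the sequence grows.

```python
import numpy as np

def associative_memory_scan(keys, values, queries):
    """Process a sequence with a fixed-size associative memory.

    The memory is one d_k x d_v matrix updated by outer-product writes,
    so the state size is independent of sequence length.
    """
    d_k, d_v = keys.shape[1], values.shape[1]
    memory = np.zeros((d_k, d_v))  # fixed-size state, regardless of context length
    outputs = []
    for k, v, q in zip(keys, values, queries):
        memory += np.outer(k, v)    # write: associate key k with value v
        outputs.append(q @ memory)  # read: retrieve values whose keys match q
    return np.stack(outputs)

# Toy usage: a length-1000 sequence, but the memory stays 16 x 32.
rng = np.random.default_rng(0)
T, d_k, d_v = 1000, 16, 32
keys = rng.standard_normal((T, d_k))
values = rng.standard_normal((T, d_v))
queries = rng.standard_normal((T, d_k))
out = associative_memory_scan(keys, values, queries)
print(out.shape)  # (1000, 32)
```

The constant-size state is the contrast with standard attention, whose cache grows linearly with context length; this is the sense in which such models handle long context without blowing up the hardware.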
Miscellany
He likes to learn a bit of everything, as reflected in the eclectic mix of projects he’s taken on over the years.