Published "Feature Learning beyond the Lazy-Rich Dichotomy: Insights from Representational Geometry," selected for a Spotlight Presentation at ICML 2025.
Research Experience
Current work focuses on developing theoretical frameworks and analysis tools to uncover the principles and mechanisms behind neural networks, drawing on methods from statistical physics, optimization theory, and high-dimensional statistics to study how patterns of activity in neural networks relate to their performance. Works with Professor SueYeon Chung, applying GLUE to both neuroscience data and machine learning models.
Education
Earned a Ph.D. from Harvard's Theory of Computation Group in 2023, advised by Professor Boaz Barak.
Background
A scientist studying how brains and machines solve complex problems, drawing on tools and perspectives from computer science, physics, and neuroscience. Currently a Flatiron Research Fellow at the Center for Computational Neuroscience, Flatiron Institute.
Miscellany
Outside of work, enjoys baseball, cooking, reading, playing Go, and listening to classical music.