Research Experience
Published several papers, including 'Positive Distribution Shift as a Framework for Understanding Tractable Learning' (Submitted), 'Shift is Good: Mismatched Data Mixing Improves Test Performance' (Submitted), 'Weak-to-Strong Generalization Even in Random Feature Networks, Provably' (ICML 2025), and 'Overfitting Behaviour of Gaussian Kernel Ridgeless Regression: Varying Bandwidth or Dimensionality' (NeurIPS 2024), among others.
Presented talks at various international venues, including the Deep Learning Theory Workshop at the Simons Institute for the Theory of Computing.
Education
PhD: Department of Mathematics, University of Chicago, advised by Professor Nathan Srebro and Professor Alexander Razborov; BA: Mathematics, Princeton University.
Background
A fourth-year PhD student in the Department of Mathematics at the University of Chicago. Interested in the theory of machine learning, particularly its computational and statistical aspects, as well as the theory of deep learning. Recent work focuses on understanding weak-to-strong generalization and positive distribution shifts.