Currently a second-year MS student in Electrical Engineering at Stanford University.
Research interests include information theory, generative modeling, image compression, coding theory, and statistical estimation.
Works on both the theoretical foundations of information theory and its applications: generative modeling of discrete data, image compression with implicit neural representations, algebraic coding for efficient communication and storage, and statistical estimation for improved sampling and inference.
Previously trained as a neuroscientist and remains interested in the mathematical underpinnings of neuroscience methods and better techniques for neural data acquisition, processing, and analysis.
Hopes eventually to bring his work in information theory and machine learning back to neuroscience, for example by deploying powerful ML models inside an MRI scanner.