2025: Exploring emergent capabilities of pure Transformers on molecular data (arXiv:2510.02259)
2025: Accelerating molecular dynamics by repurposing generative models via statistical mechanics (ICML)
2025: Distilling large ML force fields into fast, physics-consistent models for molecular dynamics (ICLR)
2024: Principled scaling of neural interatomic potentials (NeurIPS)
2025: Improving stability and timestep length in neural potentials via differentiable Boltzmann estimators (TMLR)
2024: Neural operators with spectral methods and Parseval-based spectral loss for PDEs (ICLR)
2023–2024: PDE-constrained layers in neural networks, scaled via mixture-of-experts (ICLR)
2024: Building equivariance into neural networks (ICLR Spotlight)
2024: Publication in the Journal of Chemical Information and Modeling (JCIM)
Full publication list available on Google Scholar
Background
Assistant Professor in Chemical Engineering and EECS at UC Berkeley
Member of Berkeley AI Research (BAIR)
Part of the AI+Science group in EECS and the theory group in Chemical Engineering
Faculty scientist in the Applied Mathematics and Computational Research Division at LBNL
Research focuses on developing machine learning methods driven by challenges in the natural sciences, especially physics-inspired ML
Key interests: physical inductive biases in learning, ML for scientific problems, enhancing physics-based solvers via differentiable frameworks, and handling distribution shifts in the physical sciences
Applications span atomistic and continuum domains, including fluid mechanics and molecular dynamics
Interdisciplinary connections with numerical analysis, dynamical systems, quantum mechanics, computational geometry, optimization, and category theory