On the Structure of Stationary Solutions to McKean-Vlasov Equations with Applications to Noisy Transformers, 2025.
Dense associative memory on the Bures-Wasserstein space, 2025.
Finite-Dimensional Gaussian Approximation for Deep Neural Networks: Universality in Random Weights, Bernoulli (under major revision), 2025.
Dense Associative Memory with Epanechnikov energy, NeurIPS (spotlight presentation), 2025.
In-context Learning for Mixture of Linear Regressions: Existence, Generalization and Training Dynamics, Transactions on Machine Learning Research (TMLR), 2025.
Transformers Handle Endogeneity in In-Context Linear Regression, ICLR, 2025.
From Stability to Chaos: Analyzing Gradient Descent Dynamics in Quadratic Regression, Transactions on Machine Learning Research (TMLR), 2024.
Gaussian random field approximation via Stein's method with applications to wide random neural networks, Applied and Computational Harmonic Analysis, 2024.
Additional papers are listed in the original document.
Background
Associate Professor in the Department of Statistics, the Graduate Group in Applied Mathematics, and the Graduate Program in Electrical and Computer Engineering at the University of California, Davis. Research interests include deep learning, foundation models, and related areas.
Miscellany
Support: NSF and CeDAR grants. Office: MSB 4109. Email: kbala@ucdavis.edu.