Published multiple papers in areas such as mixture and hierarchical models and Bayesian nonparametrics. Developed novel model selection procedures for finite and infinite mixture models. Provided comprehensive theory for long-standing open problems on parameter and expert estimation in softmax-gated Gaussian mixtures of experts. Developed effective training methods for sparse mixtures of experts to scale up AI models.
Research Experience
Assistant Professor in the Department of Statistics and Data Sciences at the University of Texas at Austin. Core member of the Machine Learning Laboratory and senior personnel of the Institute for Foundations of Machine Learning (IFML). Previously, a postdoctoral fellow in the EECS Department at UC Berkeley.
Education
Ph.D. in Statistics from the University of Michigan, Ann Arbor, in 2017, advised by Professors Long Nguyen and Ya'acov Ritov. Postdoctoral fellow in the Electrical Engineering and Computer Science (EECS) Department at UC Berkeley, mentored by Professors Michael I. Jordan and Martin J. Wainwright.
Background
Currently an Assistant Professor of Statistics and Data Sciences at the University of Texas at Austin. Also a core member of the Machine Learning Laboratory and senior personnel of the Institute for Foundations of Machine Learning (IFML). Research interests focus on four important aspects of complex and large-scale models and data:
1. Heterogeneity of complex data;
2. Interpretability, efficiency, scalability, and robustness of deep learning and complex machine learning models;
3. Scalability and efficiency of optimal transport for machine learning and deep learning applications;
4. Stability, optimality, and robustness of optimization and sampling algorithms for solving complex statistical machine learning models.
Miscellany
Email: minhnhat@utexas.edu
Office: WEL 5.242, 105 E 24th Street, Austin, TX 78712