Invited talk at the 2022 INFORMS Annual Meeting in the session 'Bilevel Stochastic Methods for Optimization and Learning'
Background
Ph.D. candidate in the Department of Electrical and Computer Engineering at The Ohio State University
Research interests include optimization methods for modern machine learning (ML), robust ML, and deep learning theory
On the theory side, builds new frameworks, especially through bilevel optimization, for modern ML applications such as meta-learning, adversarial training, and hyperparameter optimization
On the practical side, designs new algorithms with provable guarantees and develops efficient implementations
Recent work focuses on optimization for task-specific adaptation of large language models (LLMs), via direct fine-tuning when model access permits, or via instruction optimization for black-box LLMs
Affiliated with the NSF AI Institute for Future Edge Networks and Distributed Intelligence (AI-EDGE)