Paper 'PiKE: Adaptive Data Mixing for Multi-Task Learning Under Low Gradient Conflicts' accepted as a Spotlight at NeurIPS 2026 (Top 688 out of 21,575 submissions)
Paper 'Synthetic Text Generation for Training Large Language Models via Gradient Matching' accepted at ICML 2025 (equal contribution)
Paper 'Addax: Utilizing Zeroth-Order Gradients to Improve Memory Efficiency and Performance of SGD for Fine-Tuning Language Models' accepted at ICLR 2025
Paper 'Optimal Differentially Private Learning with Public Data' accepted at ICML 2024
Recipient of USC Viterbi School of Engineering Fellowship
Meritorious Winner (Top 697 out of 10,053 teams) in Mathematical Contest in Modeling (2021)
Finalist Award (Top 180 out of 13,753 teams) in Mathematical Contest in Modeling (2020)
Outstanding Winner of SIMIODE Challenge Using Differential Equations Modeling
Best Overall Project at Georgia Tech Capstone Design Expo