Publications: Google Scholar profile
Workshops Organized: TTIC summer workshop, Incentives for Collaborative Learning and Data Sharing, Theoretical Advances in Federated Learning
Teaching Activities: UAI'23 Tutorial on Online Optimization Meets Federated Learning
Reviewer: STOC'21; TMLR; JMLR; ICML'21, '22, '24; NeurIPS'21, '22, '23, '24; ICLR'22, '23, '24; AISTATS'22, '23; Springer MLJ
Session Chair: ICML'22, NeurIPS'22
Volunteer: IJCAI'24, ICML'20, ICLR'20
Awards: Top Reviewer at ICLR'22, ICML'22, NeurIPS'22
Projects: NSF-Simons research collaboration on the Mathematics of Deep Learning (MoDL)
Research Experience
Summer 2023: Research Intern, Privacy Preserving Machine Learning team, Sony AI. Collaborators: Nidham Gazagnadou and Lingjuan Lyu.
Summer 2020: Applied Scientist Intern, Amazon CodeGuru, Amazon Web Services.
Education
PhD: Toyota Technological Institute at Chicago (TTIC). Advisors: Prof. Nati Srebro and Prof. Lingxiao Wang.
B.Tech: Indian Institute of Technology Kanpur, Computer Science and Engineering. Advisor: Prof. Purushottam Kar.
Academic Exchange: École Polytechnique Fédérale de Lausanne (EPFL), Machine Learning and Optimization Laboratory (MLO). Advisor: Prof. Martin Jaggi.
Background
Research Interests: Collaborative learning, theoretical guarantees for optimization, and privacy of distributed algorithms.
About Me: Postdoctoral associate at Yale FDS, focusing on proving theoretical guarantees for optimization and ensuring the privacy of distributed algorithms amid data and systems heterogeneity.
Miscellany
Personal Interests: Actively looking for collaborators at Yale and beyond.