[10.2025] Introduced GenCluster, achieving IOI Gold with open-weight LLMs.
[10.2025] Released BigCodeArena.
[08.2025] Released Nemotron-Nano-v2.
[04.2025] Released Nemotron-H, a family of Mamba-Transformer models.
[04.2025] Released OpenCodeInstruct and OpenCodeReasoning.
[03.2025] Will serve as a Senior Area Chair for EMNLP 2025 and IJCNLP-AACL 2025.
[01.2025] LibEvolutionEval was accepted at NAACL 2025.
Research Experience
Before joining NVIDIA, he worked at AWS AI Labs on code generation for Amazon Q Developer; during his PhD, he interned at Meta AI, Yahoo Research, Microsoft Research, and Walmart Labs.
Education
PhD in Computer Science from the University of California, Los Angeles, supervised by Dr. Kai-Wei Chang.
Background
He is currently a Senior Research Scientist at NVIDIA. His research interests include supervised fine-tuning of large language models (LLMs) and synthetic data generation.