Selected Publications
'RoBERTa: A Robustly Optimized BERT Pretraining Approach', 'BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension', etc.
Research Experience
Facebook / Meta Inc, Menlo Park, CA: Applied Research Scientist Manager (Jul 2018 - Nov 2022), Research Scientist (Jan 2013 - Jul 2018). Project highlights include developing the MultiRay service for serving multiple very large, high-accuracy models; research on the XLM-R multilingual model and its applications; and improving the RoBERTa pretraining approach and applying it to identifying policy violations. Assistant Research Scientist at CLSP, Johns Hopkins University (Oct 2010 - Jan 2013), conducting research on Machine Learning for Structured Prediction.
Education
PhD in Computer Science from Cornell University (Aug 2010), Advisor: Prof. Claire Cardie; MSc in Computer Science from Cornell University (Aug 2006); Honors BSc with Distinction in Computer Science from the University of Delaware (May 2002), graduated Summa Cum Laude with a GPA of 4.00/4.00 and minors in Mathematics and Cognitive Science.
Background
Applied Research Leader with a track record of innovating in AI and NLP to solve real-world problems. Led teams that created industry-standard pretraining methods and large language model systems such as RoBERTa, XLM-R, and MultiRay, and applied them to improve online experiences, e.g., reducing the prevalence of hate speech and bullying posts. Experienced in building and motivating high-performing, diverse teams and in mentoring researchers and engineers.