Best Paper Award at HCOMP 2024 for 'Investigating What Factors Influence Users’ Rating of Harmful Algorithmic Bias and Discrimination'
Paper 'PrivacyLens' accepted to NeurIPS 2024 Datasets and Benchmarks Track, introducing a novel framework to benchmark unintended privacy leakage in LM agents
Two papers accepted at CSCW 2025 on secret LLM use and ethics of LLM use in HCI research
Received a $50K gift grant from Google for human-centered privacy protection in text input
Awarded NSF SaTC grant ($600K total, $200K personal share) on 'Empathy-Based Privacy Education and Design through Synthetic Persona Data Generation'
Two papers accepted at CHI’24 on LLM privacy invasion and LLM empowerment for multimodal app development
CHI’24 Special Interest Group proposal on 'Human-Centered Privacy Research in the Age of Large Language Models' accepted
Co-authored position paper 'Privacy is Not Just Memorization' (Apr 2025), analyzing 1,322 AI/ML privacy papers and revealing that 92% focus on memorization or chat leaks, while only 8% address inference-time privacy
Co-chairing the 1st Workshop on Human-Centered AI Privacy and Security (HAIPS 2025) at CCS 2025 in Taiwan
Background
Assistant Professor at Northeastern University, Khoury College of Computer Sciences
Directs the PEACH (Privacy-Enabling AI and Computer-Human interaction) Lab
Core faculty member at the Cybersecurity and Privacy Institute, Northeastern University
Believes privacy sustains human agency, safe exploration, and authentic expression in a connected world
Focuses on studying and addressing emerging LLM privacy issues from a human-centered perspective
Broad research interests at the intersection of Human-Computer Interaction (HCI), Privacy, and AI
Conducts mixed-methods research to understand privacy challenges in stakeholders’ lived experiences and builds systems to measure, model, and address these issues