Academic Achievements
Publications:
- OVERT: A Benchmark for Over-Refusal Evaluation on Text-to-Image Models (NeurIPS 2025, D&B Track)
- Generalization or Hallucination? Understanding Out-of-Context Reasoning in Transformers (NeurIPS 2025)
- Understanding and Improving Fast Adversarial Training against ℓ0 Bounded Perturbations (NeurIPS 2025)
- Reinforcement Learning for Flow-Matching Policies (arXiv preprint, 2025)
- On the Power of Convolution Augmented Transformer (AAAI 2025)
- From Self-Attention to Markov Models: Unveiling the Dynamics of Generative Transformers (ICML 2024)
- Mechanics of Next Token Prediction with Self-Attention (AISTATS 2024)
Research Experience
Worked with Prof. Samet Oymak at the University of Michigan on the theoretical foundations of self-attention and hybrid models.
Education
Bachelor's Degree: Graduated from the Department of Electrical Engineering, City University of Hong Kong, in June 2023, supervised by Prof. Rosa CHAN. Currently a second-year PhD student in EECS at UC Berkeley, supervised by Prof. Somayeh Sojoudi and working closely with Prof. Song Mei.
Background
Research Interests: Establishing theoretical foundations for generative models to improve their efficiency and reliability. Field: Information Engineering.