Yifan Yang

Google Scholar ID: PX2IQxsAAAAJ
University of California, Santa Barbara
LLM/VLM Post-training · Efficiency · Agentic AI
Citations & Impact (all-time)
  • Citations: 143
  • H-index: 6
  • i10-index: 6
  • Publications: 13
  • Co-authors: 1
Publications
  • 13 items (full list on Google Scholar)
Resume (English only)
Academic Achievements
  • Publications:
    • SharpZO work accepted at NeurIPS 2025
    • Internship paper on LLM pruning highlighted on Amazon Science
    • AdaZeta paper accepted at EMNLP 2024
    • LoRETTA paper selected as an oral presentation (top 5%) at NAACL 2024
    • PID Control-Based Self-Healing paper accepted by TMLR
    • Two papers, on LLM pruning and on parameter-efficient federated fine-tuning, accepted to ACL 2025 Findings
  • Preprints:
    • FLAT-LLM: Fine-grained Low-rank Activation Space Transformation for Large Language Model Compression
    • A Gradient-based Approach for Online Robust Deep Neural Network Training with Noisy Labels
    • Particle-based Online Bayesian Sampling (submitted to TMLR)
Research Experience
  • Applied Scientist Intern at AWS Agentic AI (June 2025 - Sep 2025): enhanced the planning capabilities of multimodal AI agents
  • Applied Scientist Intern at Amazon AGI (June 2024 - Sep 2024): worked on inference speed-up of large-scale LLMs
Education
  • PhD: Department of Computer Science, University of California, Santa Barbara (UCSB), in progress
  • MS: Department of Computer Science, University of California, Santa Barbara (UCSB)
  • B.E.: Electronic and Information Engineering, Huazhong University of Science and Technology (HUST), China
Background
  • Research Interests: Text and multimodal large language models (LLMs), including post-training, model compression, and multimodal AI agents
  • Field: Computer Science, Natural Language Processing
  • Bio: PhD candidate in the Department of Computer Science at UC Santa Barbara (UCSB), funded by Amazon AGI to work on LLM efficiency
Miscellany
  • Actively looking for full-time opportunities starting in Summer 2026!