Yicheng Fu
Google Scholar ID: Phj7N40AAAAJ
Stanford University
Natural Language Processing · Large Language Models
Citations & Impact (All-time)
  • Citations: 2,102
  • H-index: 4
  • i10-index: 2
  • Publications: 12
  • Co-authors: 7
Academic Achievements
  • 1. P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks (ACL 2022)
  • 2. SwiftSage: A Generative Agent with Fast and Slow Thinking for Complex Interactive Tasks (NeurIPS 2023, spotlight)
  • 3. UI-JEPA: Towards Active Perception of User Intent through Onscreen User Activity (Preprint 2024)
  • 4. CAMPHOR: Collaborative Agents for Multi-input Planning and High-Order Reasoning On Device (Preprint 2024)
  • 5. Pause-Tuning for Long-Context Comprehension: A Lightweight Approach to LLM Attention Recalibration (Preprint 2025)
  • 6. EgoNormia: Benchmarking Physical Social Norm Understanding (Preprint 2025)
Research Experience
  • Joined Prof. Diyi Yang's lab at Stanford after completing undergraduate studies at Tsinghua University.
Education
  • 1. MS student, Department of Electrical Engineering, Stanford University (Advisor: Diyi Yang)
  • 2. Undergraduate, Department of Electrical Engineering, Tsinghua University
Background
  • Master's student in the Department of Electrical Engineering at Stanford University, broadly interested in natural language processing and artificial intelligence.