Ziyi Ye (叶子逸)

Google Scholar ID: M3Qsb6cAAAAJ
Fudan University, Tsinghua University
Information Retrieval · Large Language Model · Human-AI Interaction
Citations & Impact (all-time)
  • Citations: 419
  • H-index: 11
  • i10-index: 12
  • Publications: 20
  • Co-authors: 18
Resume (English only)
Academic Achievements
  • August 2025: paper 'SimVBG: Simulating Individual Values by Backstory Generation' accepted by EMNLP 2025.
  • May 2025: paper 'EEG reveals the cognitive impact of polarized content in short video scenarios' published in Scientific Reports.
  • April 2025: three co-authored papers accepted by SIGIR 2025.
  • March 2025: 'Generative Language Reconstruction from Brain Recordings' published in Communications Biology.
  • September 2024: 'Pre-trained Model for EEG-based Emotion Recognition' received a Best Paper Nomination at CCIR 2024.
Research Experience
  • August 2025 - present: Assistant Professor, Institute of Trustworthy Embodied AI, Fudan University, China.
  • August 2020 - June 2025: Ph.D. student, Department of Computer Science and Technology, Tsinghua University, China.
Education
  • Ph.D., Department of Computer Science and Technology, Tsinghua University, supervised by Prof. Yiqun Liu.
Background
  • Research Interests: Multimodal computing, web search, large language models, and embodied AI. Major areas of interest include:
    - Multimodal Computing: Developing models that understand and integrate diverse sensory inputs (vision, language, touch, human signals, etc.) to perform complex, human-like tasks.
    - Human-AI Interaction: Exploring intuitive and effective ways for autonomous agents to collaborate and communicate with humans.
    - Cognition of AI: Investigating how cognitive abilities and complex behaviors emerge in AI systems through evolutionary processes.
Miscellany
  • Recruiting: Actively looking for self-motivated students to join his research group.