Yifan Zhang
Google Scholar ID: ZGeaK6QAAAAJ
Princeton University
Machine Learning · Deep Learning · Language Models
Citations & Impact (all-time)
Citations: 780
H-index: 11
i10-index: 13
Publications: 20
Co-authors: 0
Academic Achievements
  • Tensor Product Attention Is All You Need, NeurIPS 2025 Spotlight
  • Beyond Bradley-Terry Models: A General Preference Model for Language Model Alignment, ICML 2025
  • Autonomous Data Selection with Zero-shot Generative Classifiers for Mathematical Texts, ACL 2025 Findings
  • Augmenting Math Word Problems via Iterative Question Composing, AAAI 2025
  • Beyond Squared Error: Exploring Loss Design for Enhanced Training of Generative Flow Networks, ICLR 2025 Spotlight
  • Cumulative Reasoning with Large Language Models, Transactions on Machine Learning Research (TMLR)
  • SEAL: Simultaneous Label Hierarchy Exploration and Learning, TMLR
  • Information Flow in Self-Supervised Learning, ICML 2024
  • Matrix Information Theory for Self-Supervised Learning, ICML
Background
  • PhD student at Princeton University, focused on building scalable and capable large language models (LLMs)
  • Research interests: improving LLM reasoning, designing new attention mechanisms and model architectures, and aligning model behavior with human preferences via general preference models
  • Former visiting researcher at UCLA AGI Lab
  • Top Seed researcher with the Seed LLM (Foundation Model) Team, working on LLM pretraining and scaling
  • Currently seeking Summer 2026 internship or residency opportunities at industry AI labs