Zonglin Yang
Google Scholar ID: cTTRbeMAAAAJ
Ph.D. in Computer Science, Nanyang Technological University
Research interests: Natural Language Processing, LLMs for Scientific Discovery, Large Reasoning Models
Links: Homepage · Google Scholar
Citations & Impact (all-time)
Citations: 474
H-index: 12
i10-index: 12
Publications: 17
Co-authors: 20
Publications (14 items; first 8 shown)
PolyReal: A Benchmark for Real-World Polymer Science Workflows (2026), 0 citations
From 50% to Mastery in 3 Days: A Low-Resource SOP for Localizing Graduate-Level AI Tutors via Shadow-RAG (2026), 0 citations
MOOSE-Star: Unlocking Tractable Training for Scientific Discovery by Breaking the Complexity Barrier (2026), 0 citations
MiroMind-M1: An Open-Source Advancement in Mathematical Reasoning via Context-Aware Multi-Stage Policy Optimization (2025), 0 citations
Optimization of Low-Latency Spiking Neural Networks Utilizing Historical Dynamics of Refractory Periods (2025), 0 citations
Harnessing Large Language Models for Scientific Novelty Detection (2025), 0 citations
Accelerating RLHF Training with Reward Variance Increase (2025), 0 citations
MOOSE-Chem2: Exploring LLM Limits in Fine-Grained Scientific Hypothesis Discovery via Hierarchical Search (2025), 0 citations
Resume (English only)
Co-authors (20 total; 8 shown)
Erik Cambria, Professor @ NTU CCDS & Visiting @ MIT Media Lab
Xinya Du, University of Texas at Dallas, CS; UIUC CS; Cornell University, CS
Dongzhan Zhou, Researcher at Shanghai AI Lab
Wanli Ouyang (欧阳万里), Shanghai AI Lab & CUHK
Co-author 5
Jinjie Ni, National University of Singapore
Ben Gao, Wuhan University
Wanhao Liu, University of Science and Technology of China