Changxin Tian

Google Scholar ID: QrStHzUAAAAJ
Renmin University of China & Ant Group
Large Language Models
Google Scholar profile
Citations & Impact (all-time)
Citations: 1,704
H-index: 9
i10-index: 9
Publications: 16
Co-authors: 4
Contact
No contact links provided.
Publications (8 listed)
Optimal Expert-Attention Allocation in Mixture-of-Experts: A Scalable Law for Dynamic Model Design (2026). Citations: 0
MergeMix: Optimizing Mid-Training Data Mixtures via Learnable Model Merging (2026). Citations: 0
Every Activation Boosted: Scaling General Reasoner to 1 Trillion Open Language Foundation (2025). Citations: 0
MaP: A Unified Framework for Reliable Evaluation of Pre-training Dynamics (2025). Citations: 0
Arrows of Math Reasoning Data Synthesis for Large Language Models: Diversity, Complexity and Correctness (2025). Citations: 0
Towards Greater Leverage: Scaling Laws for Efficient Mixture-of-Experts Language Models (2025). Citations: 0
WSM: Decay-Free Learning Rate Schedule via Checkpoint Merging for LLM Pre-training (2025). Citations: 0
Toward Stable and Consistent Evaluation Results: A New Methodology for Base Model Evaluation (2025). Citations: 0
Co-authors (4 total)
Wayne Xin Zhao, Professor, Renmin University of China
Ji-Rong Wen, Gaoling School of Artificial Intelligence, Renmin University of China
Jun Zhou, Ant Group, Alibaba Group, Zhejiang University
Zhiqiang Zhang, Ant Group
