Scholar
Taifeng Wang
Google Scholar ID: uKZ_OrYAAAAJ
Principal Researcher, ByteDance
graph learning
large-scale pretrained language models
drug design and target discovery
search and
Homepage
Google Scholar
Citations & Impact (all-time)
Citations: 24,604
H-index: 29
i10-index: 52
Publications: 20
Co-authors: 0
Contact
No contact links provided.
Publications
8 items
Exploring Polyglot Harmony: On Multilingual Data Allocation for Large Language Models Pretraining · 2025 · Cited 0
TiKMiX: Take Data Influence into Dynamic Mixture for Language Model Pre-training · 2025 · Cited 0
MuRating: A High Quality Data Selecting Approach to Multilingual Large Language Model Pretraining · 2025 · Cited 0
MuBench: Assessment of Multilingual Capabilities of Large Language Models Across 61 Languages · 2025 · Cited 0
MoORE: SVD-based Model MoE-ization for Conflict- and Oblivion-Resistant Multi-Task Adaptation · 2025 · Cited 0
QuaDMix: Quality-Diversity Balanced Data Selection for Efficient LLM Pretraining · 2025 · Cited 0
LogicMP: A Neuro-symbolic Approach for Encoding First-order Logic Constraints · International Conference on Learning Representations · 2023 · Cited 2
Keywords and Instances: A Hierarchical Contrastive Learning Framework Unifying Hybrid Granularities for Text Generation · Annual Meeting of the Association for Computational Linguistics · 2022 · Cited 10
Resume (English only)
Co-authors: 0 (list not available)