Agora Research Hub
Taishi Nakamura
Scholar
Google Scholar ID: nbPQwgUAAAAJ
Institute of Science Tokyo
Interests: artificial general intelligence, large language models, machine learning
Links: Homepage, Google Scholar
Citations & Impact (All-time)
Citations: 181
H-index: 6
i10-index: 5
Publications: 14
Co-authors: 14
Contact
No contact links provided.
Publications (12 items)
On the Optimal Reasoning Length for RL-Trained Language Models (2026). Citations: 0
MixtureVitae: Open Web-Scale Pretraining Dataset With High Quality Instruction and Reasoning Data Built from Permissive-First Text Sources (2025). Citations: 0
Open-sci-ref-0.01: open and reproducible reference baselines for language model and dataset comparison (2025). Citations: 0
Optimal Sparsity of Mixture-of-Experts Language Models for Reasoning Tasks (2025). Citations: 0
Rewriting Pre-Training Data Boosts LLM Performance in Math and Code (2025). Citations: 0
Building Instruction-Tuning Datasets from Human-Written Instructions with Open-Weight Large Language Models (2025). Citations: 0
Wider or Deeper? Scaling LLM Inference-Time Compute with Adaptive Branching Tree Search (2025). Citations: 0
Drop-Upcycling: Training Sparse Mixture of Experts with Partial Re-initialization (2025). Citations: 0
Co-authors (14 total)
Rio Yokota, Professor, Institute of Science Tokyo
Kazuki Fujii, Institute of Science Tokyo
Naoaki Okazaki, Institute of Science Tokyo
Takuya Akiba, Sakana AI
So Kuroki, Sakana AI
Yuichi Inoue, Sakana AI
Yuki Imajuku, Sakana AI
Yujin Tang, Sakana AI