Scholar
Maciej Stefaniak
Google Scholar ID: tMHPMsEAAAAJ
LLM Researcher, University of Warsaw
Research interests: LLM, Transformer, NN efficiency, pretraining, compression, pruning, distillation
Homepage
Google Scholar
Citations & Impact
All-time
Citations: 3
H-index: 1
i10-index: 0
Publications: 5
Co-authors: 0
Contact
No contact links provided.
Publications
4 items
$μ$-Parametrization for Mixture of Experts (2025), cited 0
Decoupled Relative Learning Rate Schedules (2025), cited 0
Projected Compression: Trainable Projection for Efficient Transformer Compression (2025), cited 0
Joint MoE Scaling Laws: Mixture of Experts Can Be Memory Efficient (2025), cited 0
Resume (English only)
Co-authors
0 total (list not available)