Reza Bayat
Google Scholar ID: 58et-JMAAAAJ
University of Montreal, Mila
Research interests: Artificial Intelligence, Machine Learning
Homepage
Google Scholar
Citations & Impact (all-time)
Citations: 81
H-index: 4
i10-index: 3
Publications: 10
Co-authors: 0
Contact
Email: reza.bayat@mila.quebec
Twitter
GitHub
Publications (3 shown)
Mixture-of-Recursions: Learning Dynamic Recursive Depths for Adaptive Token-Level Computation (2025), cited 0
Steering Large Language Model Activations in Sparse Spaces (2025), cited 0
Performance Control in Early Exiting to Deploy Large Models at the Same Cost of Smaller Ones (2024), cited 0
Resume (English only)
Academic Achievements
Contributed to several cutting-edge AI research projects, including:
- Proposed 'Relaxed Recursive Transformers', a novel LLM compression method using parameter sharing and layer-wise LoRAs, achieving performance comparable to larger models.
- Contributed to 'Adaptive Inference-Time Compute', enabling LLMs to predict mid-generation whether restarting improves output without external models.
- Involved in developing 'Titans', a new architecture with neural long-term memory that scales to context windows beyond 2M tokens.
- Participated in 'Large Concept Model (LCM)' research, operating in semantic concept space (SONAR) for cross-lingual zero-shot generalization.
- Contributed to 'One-Minute Video Generation with Test-Time Training', using TTT layers to handle long-range dependencies in video generation.
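The first bullet describes compressing an LLM by tying one transformer block's parameters across depths and recovering flexibility with small per-depth LoRA corrections. A minimal numpy sketch of that idea follows; every name, dimension, and the tanh nonlinearity here are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, rank, depth = 8, 2, 3

# One shared ("tied") weight matrix reused at every recursion step.
W_shared = rng.normal(scale=0.1, size=(d_model, d_model))

# A small low-rank (LoRA-style) correction per depth: delta_d = B_d @ A_d.
loras = [
    (rng.normal(scale=0.1, size=(d_model, rank)),   # B_d
     rng.normal(scale=0.1, size=(rank, d_model)))   # A_d
    for _ in range(depth)
]

def forward(x):
    """Apply the shared block `depth` times, each with its own LoRA delta."""
    for B, A in loras:
        x = np.tanh(x @ (W_shared + B @ A))
    return x

x = rng.normal(size=(1, d_model))
y = forward(x)
print(y.shape)  # (1, 8)
```

The compression win is in the parameter count: one shared d_model×d_model matrix plus depth low-rank pairs is smaller than depth independent full matrices, while the per-depth deltas let each recursion step behave differently.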
Co-authors: 0 (list not available)