Evgenii Nikishin

Google Scholar ID: ez9FSEAAAAAJ
OpenAI
Reinforcement Learning · Machine Learning · Deep Learning
Citations & Impact
All-time
Citations: 805
H-index: 7
i10-index: 7
Publications: 13
Co-authors: 21
Academic Achievements
  • Contributed to the GPT-5 and OpenAI o3 models; published several papers, including 'Forgetting Transformer: Softmax Attention with a Forget Gate' (ICLR 2025) and 'Maxwell’s Demon at Work: Efficient Pruning by Leveraging Saturation of Neurons' (TMLR); presented work at top conferences including NeurIPS and ICLR.
Research Experience
  • Interned with David Silver's reinforcement learning team at DeepMind during his PhD.
Education
  • PhD in Computer Science from Mila, University of Montreal, advised by Pierre-Luc Bacon and Aaron Courville; previously studied at Cornell University, Higher School of Economics, and Lomonosov Moscow State University.
Background
  • Member of Technical Staff at OpenAI, building the next generation of reasoning models. His research interests include parameter, compute, and data efficiency in reinforcement learning.
Miscellany
  • Contact information includes email, CV link, and profiles on Google Scholar, Twitter, GitHub.