Aleksandar Petrov

Google Scholar ID: em54BT4AAAAJ
Research Scientist, DeepMind
Machine Learning · Robustness · Provenance
Citations & Impact (all-time)
  • Citations: 674
  • H-index: 9
  • i10-index: 9
  • Publications: 20
  • Co-authors: 0
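For readers unfamiliar with the metrics above, here is a minimal sketch of how the h-index and i10-index are computed from a list of per-paper citation counts (the helper names and example counts are illustrative, not data from this profile):

```python
def h_index(citations):
    """h-index: the largest h such that h papers each have >= h citations."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank  # this paper still satisfies the threshold
        else:
            break  # sorted descending, so no later paper can qualify
    return h

def i10_index(citations):
    """i10-index: the number of papers with at least 10 citations."""
    return sum(1 for count in citations if count >= 10)
```

For example, a citation list of `[10, 8, 5, 4, 3]` yields an h-index of 4 (four papers with at least 4 citations each) and an i10-index of 1.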
Resume
Academic Achievements
  • Publications: Multiple papers accepted at top conferences such as ICLR, NeurIPS, and ICML
  • Awards: Entropic Award for the most surprising negative result at the "I Can't Believe It's Not Better" workshop at NeurIPS 2023
  • Projects: Developed a programming language for recurrent models
  • Contributed to the Alan Turing Institute's response to the House of Lords Large Language Models Call for Evidence
Research Experience
  • Google DeepMind: Research Scientist, part of the SynthID team
  • Motional: Research Intern
  • Adobe: Research Intern
  • Meta: Research Intern
  • FAIR: Research Intern, working on watermarking
  • University of Oxford: Ph.D. research, supervised by Philip Torr and Adel Bibi
Education
  • Ph.D.: University of Oxford, Autonomous Intelligent Machines and Systems CDT, supervised by Philip Torr and Adel Bibi
  • M.Sc.: ETH Zürich, focusing on robotics, machine learning, statistics, and applied category theory. Thesis: Compositional Computational Systems
  • At ETH, worked closely with Prof. Emilio Frazzoli's group; funded by the Excellence Scholarship & Opportunity Programme (ESOP)
Background
  • Research Interests: Understanding the fundamental properties of deep learning systems to make them more reliable, robust, efficient, and broadly beneficial for society
  • Professional Field: Multimedia provenance and watermarking
  • Brief Introduction: Currently working at Google DeepMind on multimedia provenance and watermarking as part of the SynthID effort