Mark Ibrahim
Google Scholar ID: AqYyoCMAAAAJ
Fundamental AI Research, Meta AI
Artificial Intelligence · Deep Learning · Generalization
Citations & Impact (all-time)
  • Citations: 2,506
  • h-index: 21
  • i10-index: 32
  • Publications: 20
  • Co-authors: 24
Academic Achievements
  • ‘AbstentionBench: Reasoning LLMs Fail on Unanswerable Questions’ (NeurIPS 2025): Evaluates LLMs’ ability to abstain on unanswerable questions; cited in OpenAI’s GPT-5 technical report and included in the UK AI Security Institute’s Inspect Evals.
  • ‘The Factorization Curse: Which Tokens You Predict Underlie the Reversal Curse and More’ (NeurIPS 2024): Shows that predicting multiple tokens improves knowledge retrieval and planning in maze navigation.
  • ‘X-Sample Contrastive Loss: Improving Contrastive Learning with Sample Similarity Graphs’ (ICLR 2025): Proposes a graph-based contrastive loss encoding inter-sample relationships.
  • ‘Does Progress On Object Recognition Benchmarks Improve Real-World Generalization?’ (ICLR 2024): Demonstrates that benchmark progress fails to reduce geographic disparities in model performance.
  • ‘Shortcuts Come in Multiples Where Mitigating One Amplifies Others’ (CVPR 2023): Studies how deep learning models handle multiple shortcuts simultaneously.
  • ‘ImageNet-X: Understanding Model Mistakes with Factor of Variation Annotations’ (ICLR 2023, Spotlight): Analyzes common strengths and vulnerabilities across over 2,200 models.
  • ‘Global Explanations for Neural Networks: Mapping the Landscape of Predictions’ (AAAI/ACM AIES 2019): Accompanied by an open-source library and blog post.
  • Co-author of ‘A Cookbook of Self-Supervised Learning’ with Yann LeCun, Randall Balestriero, and others.
  • Gave the talk ‘Self Supervised Learning: The Final Frontier of AI’ at the Flatiron Institute, Simons Foundation (April 2025).
  • Oral lightning talk on latent space prediction at NeurIPS Self-Supervised Learning Workshop (Dec 2024).
  • Presented ‘Occam’s Razor: What’s sufficient for learning good self-supervised representations?’ at Brown University.