Yash Sharma
Scholar

Google Scholar ID: AlGCn8wAAAAJ
University of Tübingen, Max Planck Institute for Intelligent Systems
Artificial Intelligence, Machine Learning
Citations & Impact (all-time)
Citations: 7,399
H-index: 20
i10-index: 21
Publications: 20
Co-authors: 20
Publications
20 items
Academic Achievements
  • NeurIPS Compositional Learning Workshop 2024: "Pretraining Frequency Predicts Compositional Generalization of CLIP on Real-World Tasks" (*equal contribution)
  • NeurIPS 2024: "No 'Zero-Shot' Without Exponential Data: Pretraining Concept Frequency Determines Multimodal Model Performance"
  • EMNLP 2024: "Attribute Diversity Determines the Systematicity Gap in VQA" (†senior author)
  • NeurIPS 2023: "On Transfer of Adversarial Robustness from Pre-training to Downstream Tasks" (†senior author)
  • ICML 2023: "Provably Learning Object-Centric Representations" (*equal contribution)
  • TMLR 2023: "Jacobian-based Causal Discovery with Nonlinear ICA"
  • ICML Pre-training Workshop 2022: "Pixel-level Correspondence for Self-Supervised Learning from Video"
  • CLeaR 2022: "Disentanglement via Mechanism Sparsity Regularization: A New Principle for Nonlinear ICA"
  • NeurIPS 2021: "Unsupervised Learning of Compositional Energy Concepts"
  • Demonstrated that pretraining frequency predicts CLIP's compositional generalization on real-world tasks
  • Showed exponential data scaling yields linear gains in zero-shot performance
  • Revealed attribute diversity reduces the systematicity gap in VQA
  • Studied transfer of adversarial robustness from pretraining to downstream tasks
  • Developed provable methods for object-centric representation learning
  • Advanced causal discovery using the Jacobian in nonlinear ICA
  • Proposed pixel-level correspondence via optical flow for video-based self-supervised learning
  • Introduced mechanism sparsity regularization for disentanglement in nonlinear ICA
  • Formulated unsupervised compositional concept learning via energy-based models