Kulin Shah
Google Scholar ID: 67OmLg4AAAAJ
Ph.D. Student, University of Texas at Austin
Machine Learning
Links: Homepage, Google Scholar
Citations & Impact (all-time)
Citations: 281
H-index: 8
i10-index: 7
Publications: 19
Co-authors: 14
Contact
Email: kulin.shah98@gmail.com
CV, GitHub, LinkedIn
Publications (5 shown)
Provably Learning Attention with Queries (2026), cited 0
Masked Diffusion for Generative Recommendation (2025), cited 0
ReGuidance: A Simple Diffusion Wrapper for Boosting Sample Quality on Hard Inverse Problems (2025), cited 0
Does Generation Require Memorization? Creative Diffusion Models using Ambient Diffusion (2025), cited 0
Train for the Worst, Plan for the Best: Understanding Token Ordering in Masked Diffusions (2025), cited 0
Resume
Academic Achievements
ICML 2025: 'Train for the Worst, Plan for the Best: Understanding Token Ordering in Masked Diffusions' (co-first author), Outstanding Paper Award
ICML 2025: 'Does Generation Require Memorization? Creative Diffusion using Ambient Diffusion'
COLT 2025: 'Learning general Gaussian mixtures with efficient score matching' (alphabetical order)
NeurIPS 2024: 'Causal Language Modeling can Elicit Search and Reasoning Capabilities on Puzzles'
NeurIPS 2024: 'Unrolled denoising networks provably learn optimal Bayesian inference' (co-first author)
NeurIPS 2023: 'Learning Mixtures of Gaussians Using the DDPM Objective'
NeurIPS 2023: 'Ambient Diffusion: Learning Clean Distributions From Corrupted Data'
AISTATS 2022: 'Learning and Generalization in Overparameterized Normalizing Flows'
UAI 2021: 'RISAN: Robust Instance Specific Deep Abstention Network', selected for oral presentation
AIES 2021: 'Rawlsian Fair Adaptation of Deep Learning Classifiers'
AAAI 2020: 'Online Active Learning of Reject Option Classifiers', selected for oral presentation
AAAI 2019: 'Sparse Reject Option Classifier using Successive Linear Programming', selected for oral presentation
AAAI-19 AFFCON Workshop: 'Ingredients for happiness: Modeling constructs via semi-supervised content driven inductive transfer learning'
Preprint: 'PLUME: Polyhedral Learning Using Mixture of Experts'
Co-authors (14 total)
Sitan Chen, Assistant Professor of Computer Science, Harvard University
Giannis Daras, MIT
Vasilis Kontonis, Microsoft Research
Alexandros G. Dimakis, Professor, EECS, UC Berkeley
Yuval Dagan, Tel Aviv University
Aravind Gollakota, Apple
Amit Deshpande, Microsoft Research