Dan Alistarh
Google Scholar ID: 75q-6ZQAAAAJ
Professor at IST Austria
Machine Learning · Algorithms · Distributed Computing
Citations & Impact (all-time)
  • Citations: 12,822
  • H-index: 40
  • i10-index: 99
  • Publications: 20
  • Co-authors: 24
Academic Achievements
  • Supported by the Austrian FWF Cluster of Excellence Bilateral AI (BILAI)
  • Recipient of ERC Proof-of-Concept and Starting Grants
  • Received research grants from NVIDIA, Google, and Amazon
  • Five papers accepted to NeurIPS 2025, including one Spotlight: HALO, Compression Scaling Laws, Hogwild! Inference, Quartet, Influence Distillation
  • Four papers accepted to ICML 2025: AQUA-KV, EvoPress, QuEST, Optimistic Dual Averaging
  • Four papers accepted to NeurIPS 2024, including one Oral: PV-Tuning, MicroAdam, Iterative OBS, QuaRot
  • Paper on Marlin (quantized kernels for mixed-precision LLM inference) accepted to PPoPP 2025
  • Two papers accepted to EMNLP 2024: Mathador-LM (benchmarking LLM math skills) and QUIK (weight and activation quantization for LLMs)
  • Four papers accepted to ICML 2024: SPADE, AQLM, RoSA, compressed preconditioners
  • Two papers accepted to MLSys 2024: QMoE, L-GreCo
  • Two papers accepted to ICLR 2024: scaling laws for sparse models (Spotlight), SpQR
  • Three papers accepted to NeurIPS 2023: CAP (unstructured pruning), variance-reduction interpretation of knowledge distillation, ZipLM (structured pruning)
  • Three papers accepted to ICML 2023: SparseGPT, SparseProp, QSDP
  • Supervised Elias Frantar’s influential work on LLM compression, including GPTQ, Marlin, SparseGPT, and QMoE; GPTQ models have been downloaded millions of times on Hugging Face
Research Experience
  • Professor at the Institute of Science and Technology Austria (ISTA)
  • ML Research Lead at Neural Magic, Inc.
  • Postdoctoral Associate at MIT CSAIL, working with Prof. Nir Shavit
  • Researcher at Microsoft Research, Cambridge, UK
  • Researcher at ETH Zurich
  • Visiting Professor at MIT during Fall 2023