Dan Biderman

Google Scholar ID: 6WFbVJUAAAAJ
Stanford University
Machine Learning · Theoretical Neuroscience · Cognitive Science
Citations & Impact (all-time)
  • Citations: 597
  • H-index: 8
  • i10-index: 8
  • Publications: 20
  • Co-authors: 7
Resume
Academic Achievements
  • “LoRA Learns Less and Forgets Less” published in TMLR 2024 (Featured Certification)
  • “Minions: Cost-efficient Collaboration Between On-device and Cloud Language Models” accepted at ICML 2025
  • Led development of Lightning Pose package; paper published in Nature Methods 2024
  • “Partitioning variability in animal behavioral videos using semi-supervised variational autoencoders” published in PLoS Computational Biology 2021
  • Recipient of Columbia’s Titus M. Coan Dissertation Prize in Biomedical Research
  • Student speaker at Columbia’s 2025 PhD hooding ceremony
Research Experience
  • Postdoc in the Linderman Lab (Stanford Statistics) and the Hazy Research lab of Christopher Ré (Stanford CS)
  • Developed deep learning models for animal pose tracking in videos during PhD at Columbia’s Center for Theoretical Neuroscience (Lightning Pose project)
  • Collaborated with Databricks Mosaic AI on learning-forgetting tradeoffs in parameter-efficient finetuning
  • Proposed new collaboration patterns between on-device and cloud LLMs (Minions project)
  • Co-organizes the workshop on Efficient Systems for Foundation Models (most recently at ICML 2025)
Background
  • Postdoctoral Scholar at Stanford University, jointly affiliated with Statistics and Computer Science
  • Co-advised by Christopher Ré and Scott Linderman
  • Builds resource-efficient AI systems and applies them to neuroscience
  • Current focus: language models that dynamically learn from experience
  • Research interests: Efficient LLMs and multi-agent systems; hardware-aware numerical linear algebra and ML; modeling and analysis of biological data