Andres Potapczynski

Google Scholar ID: Gl1cs3AAAAAJ
New York University
Probabilistic Modeling · Approximate Inference · Sampling · Numerical Linear Algebra · Machine Learning
Citations & Impact (all-time)
  • Citations: 291
  • H-index: 8
  • i10-index: 8
  • Publications: 17
  • Co-authors: 11
Academic Achievements
  • Paper 'Training Flexible Models of Genetic Variant Effects from Functional Annotations using Accelerated Linear Algebra' accepted at ICML 2025. Proposes DeepWAS, a neural model using fast linear algebra to predict variant effects, outperforming LD score regression.
  • Paper 'Customizing the Inductive Biases of Softmax Attention using Structured Matrices' accepted at ICML 2025. Introduces new attention scoring functions based on high-rank structured matrices (e.g., BTT, MLR).
  • Paper 'Effectively Leveraging Exogenous Information across Neural Forecasters' accepted at the NeurIPS 2024 TSALM workshop. Develops a decoder method applicable to NBEATS, NHITS, PatchTST, and S4 that significantly improves forecasting performance.
  • Paper 'Searching for Efficient Linear Layers over a Continuous Space of Structured Matrices' accepted at NeurIPS 2024. Introduces a continuous parameterization over structured matrices expressible as Einsums and derives key insights on compute-optimal scaling.
  • Conducted systematic studies on replacing dense layers with sub-quadratic structured alternatives for improved efficiency.
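To illustrate the idea behind replacing dense layers with sub-quadratic structured alternatives, here is a minimal sketch using a low-rank factorization, one of the simplest structured families (the papers above study richer structures such as BTT, MLR, and general Einsums; the dimensions and rank below are illustrative assumptions, not values from the work):

```python
import numpy as np

rng = np.random.default_rng(0)

d = 512   # feature dimension (illustrative)
r = 32    # rank of the structured factorization (illustrative)

# Dense layer: one d x d weight matrix -> O(d^2) parameters and matvec cost.
W_dense = rng.standard_normal((d, d)) / np.sqrt(d)

# Structured alternative: W ~ U @ V with U (d x r) and V (r x d)
# -> O(d * r) parameters, sub-quadratic in d for fixed rank r.
U = rng.standard_normal((d, r)) / np.sqrt(d)
V = rng.standard_normal((r, d)) / np.sqrt(r)

x = rng.standard_normal(d)

y_dense = W_dense @ x    # O(d^2) matvec
y_lowrank = U @ (V @ x)  # O(d * r) matvec: never materialize U @ V

params_dense = d * d          # 262144
params_lowrank = 2 * d * r    # 32768, an 8x reduction at this rank
print(params_dense, params_lowrank)
```

The key point is that the structured matvec applies the factors in sequence rather than forming the full matrix, which is what makes both the parameter count and the compute sub-quadratic.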