A Hitchhiker's Guide to Poisson Gradient Estimation

📅 2026-02-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of gradient estimation in Poisson latent variable models arising from discrete stochastic sampling. It systematically compares the Exponential Arrival Time (EAT) simulation and the Gumbel-Softmax (GSM) relaxation, and proposes an improved EAT estimator that is theoretically guaranteed to be first-moment unbiased—exactly matching the firing rate—and substantially reduces second-moment bias, thereby enhancing gradient stability. Experimental results demonstrate that the proposed method consistently outperforms existing approaches in distribution fidelity, gradient quality, and downstream task performance, achieving results nearly indistinguishable from exact gradients while exhibiting greater robustness to hyperparameter choices. This provides practitioners in computational neuroscience and related fields with an efficient and reliable gradient estimation framework.

Technology Category

Application Category

📝 Abstract
Poisson-distributed latent variable models are widely used in computational neuroscience, but differentiating through discrete stochastic samples remains challenging. Two approaches address this: Exponential Arrival Time (EAT) simulation and Gumbel-SoftMax (GSM) relaxation. We provide the first systematic comparison of these methods, along with practical guidance for practitioners. Our main technical contribution is a modification to the EAT method that theoretically guarantees an unbiased first moment (exactly matching the firing rate), and reduces second-moment bias. We evaluate these methods on their distributional fidelity, gradient quality, and performance on two tasks: (1) variational autoencoders with Poisson latents, and (2) partially observable generalized linear models, where latent neural connectivity must be inferred from observed spike trains. Across all metrics, our modified EAT method exhibits better overall performance (often comparable to exact gradients), and substantially higher robustness to hyperparameter choices. Together, our results clarify the trade-offs between these methods and offer concrete recommendations for practitioners working with Poisson latent variable models.
Problem

Research questions and friction points this paper is trying to address.

Poisson latent variable models
gradient estimation
discrete stochastic variables
differentiation through sampling
computational neuroscience
Innovation

Methods, ideas, or system contributions that make the work stand out.

Poisson gradient estimation
Exponential Arrival Time
Gumbel-SoftMax
unbiased gradient
latent variable models
🔎 Similar Papers
No similar papers found.