🤖 AI Summary
Leaky Integrate-and-Fire (LIF) neurons suffer from limited representational capacity due to linear membrane potential decay, while Quadratic Integrate-and-Fire (QIF) neurons—though biologically more expressive—exhibit training instability in deep spiking neural networks (SNNs). Method: We propose the first discretized QIF neuron model tailored for high-performance deep SNNs. By rigorously deriving an analytical formula for a parameterized surrogate gradient window, we substantially mitigate gradient mismatch, ensuring training stability and scalability. Our approach integrates biologically inspired nonlinear dynamics, timestep-parameterized analysis, and backpropagation-based training. Contribution/Results: Evaluated on CIFAR-10/100, ImageNet, and CIFAR-10 DVS, our method consistently outperforms LIF baselines in both accuracy and dynamic representation capability. It establishes, for the first time, the feasibility and effectiveness of stable, trainable QIF neurons in large-scale SNNs.
📝 Abstract
Spiking Neural Networks (SNNs) have emerged as energy-efficient alternatives to traditional artificial neural networks, leveraging asynchronous and biologically inspired neuron dynamics. Among existing neuron models, the Leaky Integrate-and-Fire (LIF) neuron has become widely adopted in deep SNNs due to its simplicity and computational efficiency. However, this efficiency comes at the expense of expressiveness, as LIF dynamics are constrained to linear decay at each timestep. In contrast, more complex models, such as the Quadratic Integrate-and-Fire (QIF) neuron, exhibit richer, nonlinear dynamics but have seen limited adoption due to their training instability. Motivated by this, we propose the first discretization of the QIF neuron model tailored for high-performance deep spiking neural networks and provide an in-depth analysis of its dynamics. To ensure training stability, we derive an analytical formulation for surrogate gradient windows directly from our discretization's parameter set, minimizing gradient mismatch. We evaluate our method on CIFAR-10, CIFAR-100, ImageNet, and CIFAR-10 DVS, demonstrating its ability to outperform state-of-the-art LIF-based methods. These results establish our discretization of the QIF neuron as a compelling alternative to LIF neurons for deep SNNs, combining richer dynamics with practical scalability.
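To make the LIF/QIF contrast concrete, the sketch below simulates a single neuron of each type with a forward-Euler update and a rectangular surrogate gradient window. This is a generic textbook-style discretization for illustration only, not the paper's proposed discretization or its analytically derived window; all parameter values (`lam`, `tau`, `v_rest`, `v_c`, `theta`, `w`) are illustrative assumptions.

```python
import numpy as np

def lif_step(v, I, lam=0.9, theta=1.0):
    # Leaky integrate-and-fire: linear (exponential) decay each timestep.
    v = lam * v + I
    s = float(v >= theta)
    return v * (1.0 - s), s  # hard reset to 0 on spike

def qif_step(v, I, dt=1.0, tau=2.0, v_rest=0.0, v_c=0.5, theta=1.0):
    # Quadratic integrate-and-fire, forward-Euler: the (v - v_rest)(v - v_c)
    # term gives the nonlinear drift that LIF lacks.
    v = v + (dt / tau) * ((v - v_rest) * (v - v_c) + I)
    s = float(v >= theta)
    return v * (1.0 - s), s

def rect_surrogate(v, theta=1.0, w=0.5):
    # Rectangular surrogate-gradient window around the threshold, used during
    # backpropagation in place of the Heaviside step's zero/undefined derivative.
    return (np.abs(v - theta) < w).astype(float) / (2.0 * w)

# Drive both neurons with a constant input and compare spike counts.
v_l, v_q, spikes_l, spikes_q = 0.0, 0.0, 0, 0
for _ in range(20):
    v_l, s_l = lif_step(v_l, 0.3)
    v_q, s_q = qif_step(v_q, 0.3)
    spikes_l += int(s_l)
    spikes_q += int(s_q)
```

Under this constant drive the QIF membrane rises slowly below `v_c` and accelerates above it, so the two models produce different spike timings even with matched inputs; the surrogate window width `w` controls how far from threshold a membrane potential still receives gradient.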