🤖 AI Summary
Existing spiking neural networks (SNNs) predominantly employ first-order ordinary differential equations (ODEs) with a single time constant and Markovian dynamics to model neuronal membrane potential, limiting their ability to capture the long-range temporal dependencies and fractal dendritic dynamics observed in biological neurons. To address this, we propose the Fractional SPIKE Differential Equation neural network (fspikeDE), the first SNN framework incorporating fractional-order differential equations (FDEs) to explicitly model non-Markovian neural dynamics. fspikeDE leverages adjoint sensitivity methods for efficient, end-to-end differentiable training. Evaluated on image and graph benchmarks, fspikeDE achieves significant improvements in classification accuracy while maintaining high energy efficiency and low training memory overhead. Moreover, it demonstrates superior robustness to input noise. By unifying biologically plausible dynamics with principled temporal modeling, fspikeDE establishes a novel paradigm for SNNs with enhanced neurobiological fidelity and expressive temporal representation capacity.
📝 Abstract
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation, demonstrating effectiveness in processing temporal information with energy efficiency and biological realism. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. Consequently, the voltage state at any time depends solely on its immediate past value, potentially limiting network expressiveness. Real neurons, however, exhibit complex dynamics influenced by long-term correlations and fractal dendritic structures, suggesting non-Markovian behavior. Motivated by this, we propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics. These fractional dynamics enable more expressive temporal patterns beyond the capability of integer-order models. For efficient training of fspikeDE, we introduce a gradient descent algorithm that optimizes parameters by solving an augmented fractional-order differential equation (FDE) backward in time using adjoint sensitivity methods. Extensive experiments on diverse image and graph datasets demonstrate that fspikeDE consistently outperforms traditional SNNs, achieving superior accuracy, comparable energy efficiency, reduced training memory usage, and enhanced robustness against noise. Our approach provides a novel open-source computational toolbox for fractional-order SNNs, widely applicable to various real-world tasks.
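To make the non-Markovian idea concrete, the sketch below simulates a single fractional-order leaky integrate-and-fire neuron using the standard Grünwald–Letnikov discretization of the Caputo derivative, in which the voltage update carries a power-law-weighted sum over its *entire* history rather than only the previous step. This is an illustrative toy, not the paper's fspikeDE implementation: the function names, the explicit update scheme, and all parameter values (`alpha`, `tau`, `v_th`, etc.) are assumptions chosen for clarity.

```python
import numpy as np

def gl_coeffs(alpha, n):
    # Grünwald–Letnikov weights w_j = (-1)^j * C(alpha, j),
    # via the recurrence w_0 = 1, w_j = w_{j-1} * (1 - (alpha + 1) / j).
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    return w

def fractional_lif(I, alpha=0.8, dt=1.0, tau=2.0, v_th=1.0, v_reset=0.0):
    """Toy fractional-order LIF neuron: D^alpha v = (-v + I) / tau.

    Explicit Grünwald–Letnikov scheme; alpha = 1 recovers the usual
    first-order (Markovian) LIF update. Parameters are illustrative.
    """
    n_steps = len(I)
    w = gl_coeffs(alpha, n_steps + 1)
    v_hist = [0.0]                      # full voltage history (the memory)
    spikes = np.zeros(n_steps)
    for t in range(n_steps):
        f = (-v_hist[-1] + I[t]) / tau
        # Non-Markovian term: every past voltage contributes, with
        # weights that decay as a power law in the time lag.
        memory = sum(w[j] * v_hist[-j] for j in range(1, len(v_hist) + 1))
        v = dt**alpha * f - memory
        if v >= v_th:                   # threshold crossing -> spike
            spikes[t] = 1.0
            v = v_reset
        v_hist.append(v)
    return spikes, np.array(v_hist[1:])
```

Because `memory` sums over the whole trajectory, the per-step cost grows with simulation length; this is exactly the history dependence that first-order ODE neurons discard, and it is why the paper's adjoint-based training, which avoids storing all intermediate activations, matters for memory efficiency.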