Random Feature Spiking Neural Networks

📅 2025-10-01
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Spiking neural networks (SNNs) offer significant energy-efficiency advantages but face a fundamental training challenge: their non-differentiable, sparse spiking dynamics impede gradient-based end-to-end optimization. To address this, we propose S-SWIM, a gradient-free, interpretable, random-feature-driven training algorithm that, for the first time, systematically incorporates random feature methods into spike response modeling, eliminating any reliance on surrogate gradients or differentiable approximations of the spiking function. Unlike conventional random weight initialization, S-SWIM is data-adaptive, fast, and stable. Experiments show that S-SWIM reaches high accuracy on time-series forecasting tasks, both as a standalone training method and as a principled initialization for subsequent gradient-based SNN training. By decoupling learning from spike-differentiability constraints, S-SWIM establishes a novel, efficient training paradigm for SNNs.

๐Ÿ“ Abstract
Spiking Neural Networks (SNNs) as Machine Learning (ML) models have recently received a lot of attention as a potentially more energy-efficient alternative to conventional Artificial Neural Networks. The non-differentiability and sparsity of the spiking mechanism can make these models very difficult to train with algorithms based on propagating gradients through the spiking non-linearity. We address this problem by adapting the paradigm of Random Feature Methods (RFMs) from Artificial Neural Networks (ANNs) to Spike Response Model (SRM) SNNs. This approach allows training of SNNs without approximation of the spike function gradient. Concretely, we propose a novel data-driven, fast, high-performance, and interpretable algorithm for end-to-end training of SNNs inspired by the SWIM algorithm for RFM-ANNs, which we coin S-SWIM. We provide a thorough theoretical discussion and supplementary numerical experiments showing that S-SWIM can reach high accuracies on time series forecasting as a standalone strategy and serve as an effective initialisation strategy before gradient-based training. Additional ablation studies show that our proposed method performs better than random sampling of network weights.
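The core random-feature idea the abstract invokes can be illustrated with a minimal sketch (this is not the paper's S-SWIM algorithm, and all data, sizes, and constants below are illustrative assumptions): hidden weights are sampled once and frozen, the hidden non-linearity may be a hard, non-differentiable threshold, and only a linear readout is fitted by least squares, so no gradient ever has to pass through the spike-like non-linearity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: recover sin(x) from x (illustrative, not from the paper).
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()

# Random feature layer: hidden weights are sampled once and never trained.
n_features = 100
W = rng.normal(size=(1, n_features))
b = rng.uniform(-np.pi, np.pi, size=n_features)

# A hard threshold stands in for a non-differentiable spiking non-linearity;
# since it is never differentiated, no surrogate gradient is needed.
Phi = np.heaviside(X @ W + b, 0.0)

# Only the linear readout is trained, via ridge-regularised least squares.
lam = 1e-6
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_features), Phi.T @ y)

mse = np.mean((Phi @ beta - y) ** 2)
print(f"training MSE: {mse:.4f}")
```

The readout solve is a convex linear problem, which is why random-feature training is fast and stable compared with backpropagating through a spiking mechanism.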
Problem

Research questions and friction points this paper is trying to address.

Overcoming SNN training difficulties from non-differentiable spiking mechanisms
Enabling SNN training without approximating spike function gradients
Providing effective initialization for gradient-based SNN training methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adapting Random Feature Methods to Spiking Neural Networks
Training SNNs without approximating spike function gradient
Using S-SWIM algorithm for end-to-end SNN training
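The SWIM algorithm that inspired S-SWIM samples hidden weights from pairs of training points rather than from a fixed distribution, placing each unit's activation boundary between a data pair. A rough sketch of that data-driven sampling idea follows; the pair selection, scaling constants, and threshold non-linearity here are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data (illustrative only).
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.tanh(3.0 * X).ravel()

n_features = 50

# Data-driven sampling in the spirit of SWIM: each hidden weight is built
# from a pair of training points (x_i, x_j) instead of a random draw.
i = rng.integers(0, len(X), size=n_features)
j = rng.integers(0, len(X), size=n_features)
j = np.where(i == j, (j + 1) % len(X), j)  # avoid degenerate pairs

diff = X[j] - X[i]                                   # (n_features, d)
norms = np.linalg.norm(diff, axis=1, keepdims=True) ** 2
W = (diff / norms).T                                 # (d, n_features)
b = -np.sum(W.T * X[i], axis=1) - 0.5                # boundary at the pair midpoint

# Non-differentiable threshold features, then a least-squares readout.
Phi = np.heaviside(X @ W + b, 0.0)
lam = 1e-6
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_features), Phi.T @ y)
mse = np.mean((Phi @ beta - y) ** 2)
print(f"training MSE: {mse:.4f}")
```

Because boundaries are placed where training points actually lie, features concentrate in regions the data occupies, which is the intuition behind the paper's ablation finding that data-driven sampling beats purely random weights.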
Maximilian Gollwitzer
School of Computation, Information and Technology, Technical University of Munich, Garching bei München, 85748, Germany
Felix Dietrich
Professor for Physics-Enhanced Machine Learning, Technical University of Munich
numerical analysis, machine learning, complex dynamical systems