Linearized Bregman Iterations for Sparse Spiking Neural Networks

📅 2026-03-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges of parameter redundancy and limited energy efficiency in spiking neural networks (SNNs) by introducing, for the first time, the convex sparsity-inducing Linearized Bregman Iteration (LBI) into SNN training. The approach integrates an enhanced AdaBreg optimizer—an Adam variant incorporating momentum and bias correction—and leverages Bregman distance minimization along with proximal soft-thresholding updates to enable efficient sparse learning. Evaluated on three neuromorphic benchmarks—SHD, SSC, and PSMNIST—the method achieves accuracy comparable to that of Adam while reducing the number of active parameters by approximately 50%, thereby substantially lowering model complexity and computational overhead.

📝 Abstract
Spiking Neural Networks (SNNs) offer an energy-efficient alternative to conventional Artificial Neural Networks (ANNs) but typically still require a large number of parameters. This work introduces Linearized Bregman Iterations (LBI) as an optimizer for training SNNs, enforcing sparsity through iterative minimization of the Bregman distance and proximal soft-thresholding updates. To improve convergence and generalization, we employ the AdaBreg optimizer, a momentum- and bias-corrected Bregman variant of Adam. Experiments on three established neuromorphic benchmarks, i.e. the Spiking Heidelberg Digits (SHD), the Spiking Speech Commands (SSC), and the Permuted Sequential MNIST (PSMNIST) datasets, show that LBI-based optimization reduces the number of active parameters by about 50% while maintaining accuracy comparable to models trained with the Adam optimizer, demonstrating the potential of convex sparsity-inducing methods for efficient neuromorphic learning.
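The core mechanism the abstract describes — a gradient step on a dual variable followed by a proximal soft-thresholding step that drives many parameters exactly to zero — can be illustrated on a toy sparse linear-recovery problem. This is a minimal sketch of a classical linearized Bregman iteration, not the paper's AdaBreg optimizer or its SNN training code; the problem setup, step size, and threshold value are illustrative assumptions.

```python
import numpy as np


def soft_threshold(v, lam):
    # Proximal operator of lam * ||.||_1: shrinks toward zero,
    # setting entries with |v| <= lam exactly to zero (sparsity).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)


def linearized_bregman(A, b, lam=1.0, tau=None, n_iters=3000):
    """Sketch of a linearized Bregman iteration for sparse recovery.

    A dual/subgradient variable v accumulates gradient steps of the
    data-fit term; the primal iterate x is obtained by soft-thresholding
    v, so x stays sparse throughout training.
    """
    m, n = A.shape
    if tau is None:
        # conservative step size based on the spectral norm of A
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
    v = np.zeros(n)
    x = np.zeros(n)
    for _ in range(n_iters):
        v -= tau * A.T @ (A @ x - b)   # gradient step on the dual variable
        x = soft_threshold(v, lam)     # proximal (sparsifying) step
    return x


# Toy problem: recover a 3-sparse vector from 40 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 27, 64]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = linearized_bregman(A, b, lam=1.0)
```

In the paper's setting the same update structure is applied to network weights (with the data-fit gradient coming from the SNN loss, and Adam-style momentum and bias correction folded into the dual update), which is what lets roughly half of the parameters be switched off without hurting accuracy.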
Problem

Research questions and friction points this paper is trying to address.

Spiking Neural Networks
sparsity
parameter efficiency
neuromorphic learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Linearized Bregman Iterations
Spiking Neural Networks
Sparsity Induction
AdaBreg Optimizer
Neuromorphic Learning
Daniel Windhager
Silicon Austria Labs, Linz, Austria
Bernhard A. Moser
SCCH and Institute of Signal Processing, JKU, Austria
Applied Mathematics, Machine Learning, Spike-based Signal Processing and Learning
Michael Lunglmayr
Institute of Signal Processing, Johannes Kepler University, Linz, Austria