Timestep-Compressed Attack on Spiking Neural Networks through Timestep-Level Backpropagation

📅 2025-08-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing gradient-based adversarial attack methods for spiking neural networks (SNNs) suffer from high latency due to reliance on multi-timestep iterative optimization, hindering real-time security evaluation. To address this, we propose a timestep compression attack framework that uniquely exploits the intrinsic temporal dynamics of SNNs. Specifically, we introduce timestep-level backpropagation (TLBP) and adversarial membrane potential reuse (A-MPR), enabling early stopping and faster initialization. The framework is compatible with standard FGSM and PGD paradigms and supports both white-box and black-box settings. Evaluated on CIFAR-10, CIFAR-100, and CIFAR10-DVS, our method reduces attack latency by up to 56.6% (white-box) and 57.1% (black-box) over state-of-the-art approaches, while maintaining comparable attack success rates. This overcomes the fundamental latency bottleneck of conventional ANN-inspired attack methods in SNN security assessment.

📝 Abstract
State-of-the-art (SOTA) gradient-based adversarial attacks on spiking neural networks (SNNs), which largely rely on extending FGSM and PGD frameworks, face a critical limitation: substantial attack latency from multi-timestep processing, rendering them infeasible for practical real-time applications. This inefficiency stems from their design as direct extensions of ANN paradigms, which fail to exploit key SNN properties. In this paper, we propose the timestep-compressed attack (TCA), a novel framework that significantly reduces attack latency. TCA introduces two components founded on key insights into SNN behavior. First, timestep-level backpropagation (TLBP) builds on our finding that the global temporal information used during backpropagation to generate perturbations is not critical to an attack's success, enabling per-timestep evaluation and early stopping. Second, adversarial membrane potential reuse (A-MPR) is motivated by the observation that the initial timesteps are inefficiently spent accumulating membrane potential, a warm-up phase that can be pre-calculated and reused. Our experiments on VGG-11 and ResNet-17 with the CIFAR-10/100 and CIFAR10-DVS datasets show that TCA significantly reduces the required attack latency by up to 56.6% and 57.1% compared to SOTA methods in white-box and black-box settings, respectively, while maintaining a comparable attack success rate.
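The TLBP idea described above can be illustrated with a toy sketch: instead of unrolling the full timestep window before checking the attack, the spike outputs are evaluated at every timestep and the loop stops as soon as the accumulated prediction is already misclassified. This is a minimal illustration, not the paper's implementation: the single-layer LIF model, the `lif_step` and `tlbp_attack` names, and the use of the true class's weight column as a stand-in for a surrogate gradient are all assumptions made for brevity.

```python
import numpy as np

# Hypothetical toy setup (not from the paper): one LIF layer, 4 inputs -> 3 classes.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3)) * 0.5

def lif_step(x, v, tau=0.9, thresh=1.0):
    """One LIF timestep: leak, integrate, fire, reset."""
    v = tau * v + x @ W
    spikes = (v >= thresh).astype(float)
    v = v * (1.0 - spikes)  # reset neurons that fired
    return spikes, v

def tlbp_attack(x, label, eps=0.3, timesteps=8):
    """FGSM-style perturbation with per-timestep early stopping (TLBP idea):
    stop as soon as the accumulated spike counts already misclassify the input.
    The sign of W[:, label] stands in for a surrogate input gradient here."""
    x_adv = np.clip(x - eps * np.sign(W[:, label]), 0.0, 1.0)
    v = np.zeros(3)
    counts = np.zeros(3)
    for t in range(timesteps):
        spikes, v = lif_step(x_adv, v)
        counts += spikes
        if counts.sum() > 0 and counts.argmax() != label:
            return x_adv, t + 1  # early stop: remaining timesteps are skipped
    return x_adv, timesteps
```

The early-stop check is what compresses the timesteps: on inputs that flip quickly, most of the window is never simulated, which is the source of the latency savings the abstract reports.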
Problem

Research questions and friction points this paper is trying to address.

Reducing multi-timestep attack latency in SNNs
Improving efficiency of gradient-based adversarial attacks
Enabling practical real-time attack applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Timestep-level backpropagation enables early stopping
Adversarial membrane potential reuse pre-calculates warm-up
Framework reduces attack latency by over 50%
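The A-MPR contribution above can likewise be sketched: run the warm-up timesteps once on the clean input, cache the resulting membrane potential, and start every later attack iteration from that cached state instead of from zero. This is an illustrative sketch under the same toy single-layer LIF assumptions as before; the `run` and `warm_up` names and the 3-step warm-up length are hypothetical.

```python
import numpy as np

# Hypothetical toy setup (not from the paper): one LIF layer, 4 inputs -> 3 classes.
rng = np.random.default_rng(1)
W = rng.standard_normal((4, 3)) * 0.5

def run(x, v0, timesteps, tau=0.9, thresh=1.0):
    """Simulate the LIF layer for `timesteps` steps from membrane state v0,
    returning the accumulated spike counts and the final membrane state."""
    v = v0.copy()
    counts = np.zeros(3)
    for _ in range(timesteps):
        v = tau * v + x @ W
        spikes = (v >= thresh).astype(float)
        counts += spikes
        v *= 1.0 - spikes  # reset neurons that fired
    return counts, v

def warm_up(x, warm_steps=3):
    """Pre-calculate the membrane potential after the warm-up phase (A-MPR idea)."""
    _, v = run(x, np.zeros(3), warm_steps)
    return v

x = np.full(4, 0.5)
v_warm = warm_up(x)  # computed once, then reused
# Each attack iteration now starts from v_warm rather than zeros,
# so the warm-up timesteps are not re-simulated:
counts, _ = run(x, v_warm, timesteps=5)
```

Because the LIF dynamics here are deterministic and fully captured by the membrane state, resuming from the cached potential is exactly equivalent to re-running the warm-up, which is why the reuse saves timesteps without changing the outcome in this toy setting.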