Reconstructing Spiking Neural Networks Using a Single Neuron with Autapses

📅 2026-03-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a time-delayed autaptic spiking neural network (TDA-SNN) to address the high communication overhead and state storage costs inherent in conventional multi-layer dense SNN architectures. By incorporating time-delayed autaptic connections within a single leaky integrate-and-fire neuron, TDA-SNN unifies reservoir computing, multilayer perceptron, and convolution-like operations through prototype learning and internal temporal state reuse. This approach represents the first integration of multiple SNN computational paradigms within a single neuron, substantially reducing both neuron count and state memory requirements while enhancing the information capacity per neuron. Evaluated on sequence modeling, event-based, and image classification benchmarks, TDA-SNN achieves competitive performance, demonstrating its favorable space–time trade-off and offering a highly compact computational unit for brain-inspired computing.

📝 Abstract
Spiking neural networks (SNNs) are promising for neuromorphic computing, but high-performing models still rely on dense multilayer architectures with substantial communication and state-storage costs. Inspired by autapses, we propose time-delayed autapse SNN (TDA-SNN), a framework that reconstructs SNNs with a single leaky integrate-and-fire neuron and a prototype-learning-based training strategy. By reorganizing internal temporal states, TDA-SNN can realize reservoir, multilayer perceptron, and convolution-like spiking architectures within a unified framework. Experiments on sequential, event-based, and image benchmarks show competitive performance in reservoir and MLP settings, while convolutional results reveal a clear space–time trade-off. Compared with standard SNNs, TDA-SNN greatly reduces neuron count and state memory while increasing per-neuron information capacity, at the cost of additional temporal latency in extreme single-neuron settings. These findings highlight the potential of temporally multiplexed single-neuron models as compact computational units for brain-inspired computing.
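The core mechanism described above, a single leaky integrate-and-fire (LIF) neuron whose own delayed spikes feed back through an autaptic connection, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name `lif_autapse` and all parameter values (`tau`, `v_th`, `w_auto`, `delay`) are illustrative assumptions.

```python
import numpy as np

def lif_autapse(inputs, tau=20.0, v_th=1.0, w_auto=0.5, delay=5):
    """Simulate one LIF neuron with a time-delayed autapse.

    Illustrative sketch only: the neuron's spike from `delay` steps
    ago is fed back into its own membrane potential, giving a single
    neuron access to its past internal states.
    """
    T = len(inputs)
    v = 0.0
    spikes = np.zeros(T, dtype=int)
    for t in range(T):
        # delayed self-feedback: this neuron's own spike `delay` steps ago
        feedback = w_auto * spikes[t - delay] if t >= delay else 0.0
        # leaky integration of external input plus autaptic feedback
        v += (-v + inputs[t] + feedback) / tau
        if v >= v_th:
            spikes[t] = 1
            v = 0.0  # hard reset after spiking
    return spikes

# With a suprathreshold constant drive, the neuron emits a spike train
# shaped by both the input and its own delayed feedback.
out = lif_autapse(np.full(100, 2.0))
```

The delayed self-connection is what lets one neuron stand in for a multi-neuron network: by reading out its state at different temporal offsets, the same membrane acts like several units multiplexed in time.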
Problem

Research questions and friction points this paper is trying to address.

Spiking Neural Networks
Neuromorphic Computing
Model Compression
Autapses
Temporal Multiplexing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spiking Neural Networks
Autapse
Single-Neuron Computing
Temporal Multiplexing
Neuromorphic Computing
Wuque Cai
Brain-Apparatus Communication Institute, University of Electronic Science and Technology of China, Chengdu, China
Hongze Sun
Brain-Apparatus Communication Institute, University of Electronic Science and Technology of China, Chengdu, China
Quan Tang
Pengcheng Laboratory
Computer Vision, Anomaly Detection, Deep Learning
Shifeng Mao
Brain-Apparatus Communication Institute, University of Electronic Science and Technology of China, Chengdu, China
Zhenxing Wang
Finisar Corporation
Fiber Optic Communications
Jiayi He
Master student, Hefei University of Technology
Sign Language
Duo Chen
School of Artificial Intelligence, Chongqing University of Education, Chongqing, China
Dezhong Yao
University of Electronic Science and Technology of China
Chief Editor, Brain-Apparatus Communication
Daqing Guo
Brain-Apparatus Communication Institute, University of Electronic Science and Technology of China, Chengdu, China