A spiking photonic neural network of 40.000 neurons, trained with rank-order coding for leveraging sparsity

📅 2024-11-28
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address scalability, energy efficiency, and multi-task learning bottlenecks in spiking neural networks (SNNs), this work presents the first large-scale photonic SNN built entirely from commercial optical components—comprising 40,000 excitable photonic neurons. Neuronal dynamics are modeled via an enhanced Ikeda map, integrated with delay coding and sparse activation mechanisms. Crucially, we introduce the Simultaneous Perturbation Stochastic Approximation (SPSA) algorithm—the first such application to photonic SNN training—within a reservoir computing framework and a hardware-efficient sparse computation paradigm. On MNIST, the system achieves 83.5% accuracy while activating only 22% of neurons; at 77.5% accuracy, neuron activation drops to just 8.5%, markedly improving energy efficiency and architectural scalability. This work establishes a novel paradigm for biologically inspired photonic neuromorphic hardware.
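The enhanced Ikeda map with inhibitory feedback described above can be sketched as a simple two-variable iteration. The exact form of the slow inhibitory variable `y` and all parameter values (`beta`, `phi`, `eps`, `g`) are illustrative assumptions here, not the paper's equations:

```python
import numpy as np

def ikeda_step(x, y, u, beta=2.5, phi=0.2, eps=0.05, g=1.0):
    """One iteration of a schematic Ikeda-type map augmented with a
    slow inhibitory feedback variable y (assumed form, illustration only).
    x: fast neuron state, y: slow inhibition, u: external input."""
    x_next = beta * np.sin(x - g * y + phi + u) ** 2  # Ikeda nonlinearity
    y_next = y + eps * (x - y)  # slow low-pass inhibitory feedback
    return x_next, y_next

# Probe the map's response to a brief input pulse.
x, y = 0.0, 0.0
trace = []
for n in range(200):
    u = 0.6 if n == 50 else 0.0  # short perturbation at step 50
    x, y = ikeda_step(x, y, u)
    trace.append(x)
```

In this kind of model, the slow negative feedback pulls the fast variable back toward rest after a perturbation, which is what makes spike-like, excitable responses possible.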

📝 Abstract
Spiking neural networks are neuromorphic systems that emulate certain aspects of biological neurons, offering potential advantages in energy efficiency and speed by, for example, leveraging sparsity. While CMOS-based electronic SNN hardware has shown promise, scalability and parallelism challenges remain. Photonics provides a promising platform for SNNs due to the speed of excitable photonic devices standing in as neurons and the parallelism and low latency of optical signal conduction. Here, we present a photonic SNN comprising 40,000 neurons using off-the-shelf components, including a spatial light modulator and a CMOS camera, enabling scalable and cost-effective implementations for photonic SNN proof-of-concept studies. The system is governed by a modified Ikeda map, where adding an additional inhibitory feedback forcing term introduces excitability akin to biological dynamics. Using latency encoding and sparsity, the network achieves 83.5% accuracy on MNIST using 22% of neurons, and 77.5% with 8.5% neuron utilization. Training is performed via liquid state machine concepts combined with the hardware-compatible SPSA algorithm, marking its first use in photonic neural networks. This demonstration integrates photonic nonlinearity, excitability, and sparse computation, paving the way for efficient large-scale photonic neuromorphic systems.
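The SPSA algorithm mentioned in the abstract is attractive for hardware training because it estimates a full gradient from only two loss evaluations per step, regardless of parameter count. A minimal sketch follows; the gain schedules and the toy quadratic objective are standard textbook choices, not the paper's training setup:

```python
import numpy as np

def spsa_minimize(loss, theta, iters=200, a=0.1, c=0.1, seed=0):
    """Minimal SPSA: per step, perturb all parameters simultaneously
    with a random +/-1 (Rademacher) vector and estimate the gradient
    from two loss evaluations (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta, dtype=float).copy()
    for k in range(1, iters + 1):
        ak = a / k ** 0.602  # standard SPSA gain decay exponents
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        # two-sided simultaneous perturbation gradient estimate
        g_hat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2 * ck * delta)
        theta -= ak * g_hat
    return theta

# Toy quadratic with minimum at [1, -2]; SPSA converges near it.
theta = spsa_minimize(lambda t: np.sum((t - np.array([1.0, -2.0])) ** 2),
                      np.zeros(2))
```

Because only the scalar loss is queried, the same loop works when the "loss" comes from measuring a physical system's output, which is why SPSA suits photonic hardware where analytic gradients are unavailable.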
Problem

Research questions and friction points this paper is trying to address.

Spiking Neural Networks
Scalability
Parallel Processing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Photonics Spiking Neural Network
Spatial Light Modulator and CMOS Camera
Efficient Handwritten Digit Recognition
Ria Talukder
FEMTO-ST Institute/Optics Department, CNRS - University Franche-Comté, 15B avenue des Montboucons, Besançon Cedex, 25030, France
A. Skalli
FEMTO-ST Institute/Optics Department, CNRS - University Franche-Comté, 15B avenue des Montboucons, Besançon Cedex, 25030, France
X. Porte
Institute of Photonics, Department of Physics, University of Strathclyde, 99 George str., Glasgow G1 1RD, UK
Simon Thorpe
Centre de Recherche Cerveau et Cognition CERCO UMR5549, CNRS—Université Toulouse III, Toulouse, France
Daniel Brunner
CNRS researcher, FEMTO-ST, Optics department, Besancon
Photonic neural networks, unconventional computation, semiconductor nonlinear optics, complex photonics, nonlinear dynamics