🤖 AI Summary
To address the high energy consumption and computational overhead of conventional RNNs running on general-purpose hardware for time-series processing, this work proposes a multilayer hardware recurrent neural network built from spintronic oscillators. By leveraging the intrinsic transient nonlinear dynamics of the oscillators, the architecture enables low-power temporal modeling. For the first time, standard backpropagation through time (BPTT) and mainstream machine learning frameworks are used to train such a hardware-level spintronic dynamical system. The work also establishes design principles for configuring oscillator time constants and selecting hyperparameters to match multi-timescale dynamics. Numerical simulations show that the proposed system achieves 89.83±2.91% accuracy on sequential digit classification—comparable to a software RNN of equivalent architecture—while promising significantly improved energy efficiency.
📝 Abstract
The ability to process time series at low energy cost is critical for many applications. Recurrent neural networks, which can perform such tasks, are computationally expensive when implemented in software on conventional computers. Here we propose to implement a recurrent neural network in hardware using spintronic oscillators as dynamical neurons. Using numerical simulations, we build a multi-layer network and demonstrate that we can use backpropagation through time (BPTT) and standard machine learning tools to train this network. Leveraging the transient dynamics of the spintronic oscillators, we solve the sequential digits classification task with $89.83\pm2.91~\%$ accuracy, as good as the equivalent software network. We devise guidelines on how to choose the time constant of the oscillators as well as the hyper-parameters of the network to adapt to different input time scales.
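The central idea — transient dynamical neurons with a tunable time constant, unrolled over an input sequence so that standard BPTT applies — can be sketched generically. The following is a minimal illustration, not the authors' spintronic oscillator model: it uses a simple leaky recurrent update $\dot{x} = (-x + f(W_{\mathrm{in}} u + W_{\mathrm{rec}} x))/\tau$, with all weights and the `simulate` helper being illustrative placeholders. It shows why the time constant $\tau$ must be matched to the input timescale: a large $\tau$ yields slow, smooth state trajectories (long memory), while a small $\tau$ lets the state track fast input variations.

```python
import numpy as np

def simulate(inputs, W_in, W_rec, tau, dt=1.0):
    """Unroll leaky recurrent dynamics over an input sequence.

    inputs: (T, n_in) sequence; tau: neuron time constant (same units as dt).
    Returns the (T, n_hid) trajectory of hidden states. In a framework with
    autodiff, differentiating a loss through this unrolled loop is exactly
    backpropagation through time (BPTT).
    """
    n_hid = W_rec.shape[0]
    x = np.zeros(n_hid)
    traj = []
    for u in inputs:
        drive = np.tanh(W_in @ u + W_rec @ x)
        # Euler step of dx/dt = (-x + drive) / tau
        x = x + (dt / tau) * (-x + drive)
        traj.append(x.copy())
    return np.array(traj)

rng = np.random.default_rng(0)
T, n_in, n_hid = 50, 4, 8
inputs = rng.standard_normal((T, n_in))
W_in = 0.5 * rng.standard_normal((n_hid, n_in))
W_rec = 0.5 * rng.standard_normal((n_hid, n_hid)) / np.sqrt(n_hid)

fast = simulate(inputs, W_in, W_rec, tau=2.0)   # relaxes quickly to the drive
slow = simulate(inputs, W_in, W_rec, tau=20.0)  # retains memory of past inputs

# A slower neuron changes state less per step, so its trajectory is smoother.
print(np.abs(np.diff(fast, axis=0)).mean(),
      np.abs(np.diff(slow, axis=0)).mean())
```

In the paper's setting the leaky update above would be replaced by the oscillator's physical equations of motion, but the training recipe is the same: unroll the dynamics in a machine learning framework and let autodiff perform BPTT through the trajectory.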