🤖 AI Summary
Von Neumann architectures face severe energy-efficiency bottlenecks when supporting the massively parallel matrix operations required by biologically inspired computing. This work proposes a low-power neuromorphic associative memory based on spintronic devices, implementing a Hopfield network with magnetic tunnel junctions (MTJs) as its core elements. Key contributions include: (1) a novel ultra-low-power spin synapse consuming only 17.4% of the power of state-of-the-art designs; (2) a transistor-minimized voltage converter that reduces transistor count by 53.3%; and (3) the first dedicated simulator for associative memory, delivering a 5×10⁶× speedup over conventional simulators while preserving a comparable associative-memory effect. Together, these contributions establish a full-stack paradigm, spanning the device, circuit, and simulation layers, for scalable, energy-efficient neuromorphic computing.
📝 Abstract
Biologically inspired computing models have made significant progress in recent years, but the conventional von Neumann architecture is inefficient for the large-scale matrix operations and massive parallelism these models require. This paper presents Spin-NeuroMem, a low-power circuit design of a Hopfield network for associative memory. Spin-NeuroMem is equipped with energy-efficient spintronic synapses that use magnetic tunnel junctions (MTJs) to store the weight matrices of multiple associative memories. The proposed synapse design consumes as little as 17.4% of the power of state-of-the-art synapse designs. Spin-NeuroMem also includes a novel voltage converter that reduces transistor usage by 53.3% while enabling effective Hopfield network computation. In addition, we propose the first dedicated associative memory simulator, which achieves a 5×10⁶× speedup while producing a comparable associative memory effect. By harnessing the potential of spintronic devices, this work paves the way for energy-efficient and scalable neuromorphic computing systems.
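For readers unfamiliar with the underlying model, the sketch below illustrates the Hopfield associative memory that Spin-NeuroMem realizes in hardware: weights come from the standard Hebbian outer-product rule (held by MTJ synapses in the actual circuit, by a NumPy array here), and recall iterates a sign-threshold update until a stored pattern is recovered. This is a minimal software illustration of the computational model only, not the paper's circuit or simulator; the function names are illustrative.

```python
import numpy as np

def store(patterns):
    """Hebbian learning: build the Hopfield weight matrix from bipolar
    (+1/-1) patterns. In Spin-NeuroMem these weights would be held by
    MTJ synapses; here they are a plain NumPy array."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # Hopfield networks have no self-connections
    return W / n

def recall(W, probe, max_iters=100):
    """Iterate the sign-threshold update from a noisy probe until stable."""
    s = probe.copy()
    for _ in range(max_iters):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1       # break ties toward +1
        if np.array_equal(s_new, s):
            break                   # reached a fixed point (a stored memory)
        s = s_new
    return s

# Store two 8-bit patterns, then recover one from a corrupted probe.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = store(patterns)
probe = patterns[0].copy()
probe[:2] *= -1                     # flip two bits of the first pattern
print(recall(W, probe))             # recovers the first stored pattern
```

In this toy example a single synchronous update already restores the corrupted pattern; the appeal of an MTJ-based implementation is that the dense matrix-vector product in `recall` is performed in the analog domain by the synapse array rather than by a processor.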