Rectified-Linear and Recurrent Neural Networks Built with Spin Devices

Abstract

A spin synapse with analog programmability operated entirely by charge current is proposed. Compared with spin-current programming, the proposed all-charge-current synapses can be placed in a larger crossbar array to form a denser and larger neural network. Using current summation along the crossbar columns, the dot product is realized in the analog domain. We further employ a compact racetrack converter as the neuron to implement a rectified-linear neural network, saving 67% area and 69% energy compared with a spin binary-threshold neural network while achieving similar accuracy on the MNIST digit recognition benchmark. By storing the domain wall motion in a time-based fashion, a recurrent neural network can be realized for time-dependent inference tasks.
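As a rough illustration (not the paper's implementation), the crossbar dot product can be modeled as Kirchhoff current summation, I_j = sum_i G_ij * V_i, with the racetrack-converter neuron idealized as a rectified-linear activation; the conductance and voltage values below are hypothetical.

    import numpy as np

    # Minimal sketch: a crossbar computes a dot product by current
    # summation. Each synapse is a conductance G[i, j]; applying input
    # voltages V along the rows yields column currents
    # I[j] = sum_i G[i, j] * V[i].
    def crossbar_dot(G, V):
        return V @ G  # column currents I_j

    # The racetrack-converter neuron is modeled here as an ideal
    # rectified-linear activation (an assumption for illustration).
    def relu(I):
        return np.maximum(I, 0.0)

    # Example: a 3x2 crossbar with hypothetical conductances (siemens)
    G = np.array([[1e-6, 2e-6],
                  [3e-6, 1e-6],
                  [2e-6, 4e-6]])
    V = np.array([0.1, -0.2, 0.05])   # input voltages (volts)
    print(relu(crossbar_dot(G, V)))   # neuron outputs after rectification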

Publication
2017 IEEE International Symposium on Circuits and Systems (ISCAS)
Kaiyuan Yang
Associate Professor of ECE