Affiliation:
1. Federal Research Centre “Information and Control” of the Russian Academy of Sciences,
44 Vavilov Str., Moscow 119333, Russia;
2. Moscow Aviation Institute (National Research University),
4 Volokolamskoe Shosse, Moscow 125993, Russia
Abstract
The natural parallelism of matrix-vector operations inherent in memristor crossbars creates opportunities for their effective use in neural network computing. Analog calculations are orders of magnitude faster than calculations on central processors and graphics accelerators, and the energy cost of the mathematical operations is significantly lower. The essential limitation of analog computing is its low accuracy, which makes it relevant to study how neural network quality depends on the precision with which the weights are set. The paper considers two convolutional neural networks trained on the MNIST (handwritten digits) and CIFAR-10 (airplanes, boats, cars, etc.) data sets. The first convolutional neural network consists of two convolutional layers, one subsampling layer and two fully connected layers; the second consists of four convolutional layers, two subsampling layers and two fully connected layers. Calculations in the convolutional and fully connected layers are performed through matrix-vector operations implemented on memristor crossbars. The subsampling layers require finding the maximum of several values, an operation that can also be implemented at the analog level. Training of a neural network runs separately from data analysis; as a rule, gradient optimization methods are used at the training stage, and it is advisable to perform these calculations on a CPU. When setting the weights, 3-4 precision bits are required to obtain acceptable recognition quality when the network is trained on MNIST, and 6-10 precision bits are required when it is trained on CIFAR-10.
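The central quantity in the abstract is the number of precision bits available when programming crossbar conductances. Below is a minimal sketch of how the effect of limited weight precision on a matrix-vector product can be simulated in software; the symmetric uniform quantization scheme, the layer dimensions and the function names are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def quantize_weights(w, n_bits):
    """Uniformly quantize a weight array to 2**n_bits - 1 symmetric levels.

    This mimics the limited precision with which memristor conductances can
    be set; the actual quantization scheme used in the paper may differ.
    """
    levels = 2 ** n_bits - 1
    w_max = np.abs(w).max()
    if w_max == 0:
        return w
    step = 2 * w_max / levels
    return np.round(w / step) * step

# Hypothetical layer: compare an exact matrix-vector product (the operation
# a crossbar performs) with the product computed from quantized weights.
rng = np.random.default_rng(0)
w = rng.normal(size=(128, 64))   # assumed layer weights
x = rng.normal(size=64)          # assumed input activations

exact = w @ x
for n_bits in (3, 4, 6, 10):
    approx = quantize_weights(w, n_bits) @ x
    rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
    print(f"{n_bits} bits: relative error {rel_err:.3e}")
```

In a full experiment of the kind described in the abstract, the same quantization step would be applied to the weights of every convolutional and fully connected layer of the trained network before measuring recognition quality on the test set.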
Publisher
National University of Science and Technology MISiS
Cited by
2 articles.