Author:
Soupizet Thomas, Jouni Zalfa, Wang Siqi, Benlarbi-Delai Aziz, M. Ferreira Pietro
Abstract
Different from a classical artificial neural network, which processes digital data, the spiking neural network (SNN) processes spike trains. Its event-driven property helps capture the rich dynamics that neurons exhibit within the brain, and the sparsity of the collected spikes helps reduce computational power. A novel synthesis framework is proposed and an algorithm is detailed to guide designers toward deep-learning, energy-efficient analog SNNs using MNIST. An analog SNN composed of 86 electronic neurons (eNeurons) and 1238 synapses interacting through two hidden layers is illustrated. Three different eNeuron implementation models are tested: (Leaky) Integrate-and-Fire (LIF), Morris-Lecar (ML) simplified (simp.), and biomimetic (bio.). The proposed SNN, coupling deep learning and ultra-low power, is trained on MNIST using a common machine-learning framework (TensorFlow). LIF eNeuron implementations present limitations and weaknesses in terms of dynamic range. Both ML eNeurons achieve a robust accuracy of approximately 0.82.
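The abstract mentions the (Leaky) Integrate-and-Fire model as one of the tested eNeuron implementations. As background only, a minimal discrete-time LIF simulation can be sketched as follows; the parameter values (time constant, threshold, reset) are illustrative assumptions and not taken from the paper's circuit design:

```python
import numpy as np

def lif_neuron(current, dt=1e-3, tau=20e-3, v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Toy discrete-time leaky integrate-and-fire neuron (illustrative only).

    current: 1-D array of input current samples (arbitrary units).
    Returns a boolean spike train of the same length.
    """
    v = v_rest
    spikes = np.zeros(len(current), dtype=bool)
    for i, I in enumerate(current):
        # Leaky integration of the membrane potential toward v_rest,
        # driven by the input current
        v += dt * (-(v - v_rest) / tau + I)
        if v >= v_th:      # threshold crossing emits a spike
            spikes[i] = True
            v = v_reset    # hard reset after the spike
    return spikes

# Constant suprathreshold input produces a regular spike train
train = lif_neuron(np.full(200, 100.0))
print(int(train.sum()))
```

The event-driven, sparse output of such a model is what the paper exploits for low-power analog implementation; the actual eNeurons are analog circuits, not software loops.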
Publisher
Journal of Integrated Circuits and Systems
Subject
Electrical and Electronic Engineering
Cited by
3 articles.