Affiliation:
1. Computer and Information Science and Engineering Department, University of Florida, Gainesville, FL 32611-6120, U.S.A.
Abstract
We derive a synaptic weight update rule for learning temporally precise spike train–to–spike train transformations in multilayer feedforward networks of spiking neurons. The framework, aimed at seamlessly generalizing error backpropagation to the deterministic spiking neuron setting, is based strictly on spike timing and avoids invoking concepts pertaining to spike rates or probabilistic models of spiking. The derivation is founded on two innovations. First, an error functional is proposed that compares the spike train emitted by the output neuron of the network to the desired spike train by way of their putative impact on a virtual postsynaptic neuron. This formulation sidesteps the need for spike alignment and leads to closed-form solutions for all quantities of interest. Second, virtual assignment of weights to spikes rather than synapses enables a perturbation analysis of individual spike times and synaptic weights of the output, as well as all intermediate neurons in the network, which yields the gradients of the error functional with respect to these entities. Learning proceeds via a gradient descent mechanism that leverages these gradients. Simulation experiments demonstrate the efficacy of the proposed learning framework. The experiments also highlight asymmetries between synapses on excitatory and inhibitory neurons.
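The error functional described in the abstract can be pictured as comparing the postsynaptic potentials that the actual and desired output spike trains would induce on a common virtual neuron. The Python sketch below illustrates this idea only; the exponential kernel, time constant, discretized time grid, and function names are illustrative assumptions, not the paper's exact formulation (which is closed form and does not require numerical integration).

import numpy as np

def psp(t, tau=10.0):
    # Illustrative postsynaptic potential kernel: causal exponential decay (assumed; ms).
    return np.where(t >= 0.0, np.exp(-t / tau), 0.0)

def virtual_potential(spike_times, t_grid, tau=10.0):
    # Potential that a spike train would induce on the virtual postsynaptic neuron.
    v = np.zeros_like(t_grid)
    for s in spike_times:
        v += psp(t_grid - s, tau)
    return v

def spike_train_error(output_spikes, desired_spikes, t_grid, tau=10.0):
    # Integrated squared difference of the two induced potentials (Riemann sum).
    diff = (virtual_potential(output_spikes, t_grid, tau)
            - virtual_potential(desired_spikes, t_grid, tau))
    dt = t_grid[1] - t_grid[0]
    return float(np.sum(diff ** 2) * dt)

# Example: the error shrinks as the output spike train approaches the desired one
# and vanishes when the two trains coincide.
t_grid = np.linspace(0.0, 100.0, 2001)               # time axis in ms (assumed)
desired = np.array([20.0, 55.0, 80.0])
output = np.array([23.0, 50.0, 84.0])
print(spike_train_error(output, desired, t_grid))    # > 0 for mismatched trains
print(spike_train_error(desired, desired, t_grid))   # 0.0 for a perfect match

Because the comparison is made through the induced potential rather than by pairing individual spikes, no explicit spike alignment is needed, and the smooth dependence of the potential on spike times is what makes gradients with respect to individual spike times well defined.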
Subject
Cognitive Neuroscience, Arts and Humanities (miscellaneous)
Cited by
6 articles.