Abstract
A key driver of mammalian intelligence is the ability to represent incoming sensory information across multiple abstraction levels. For example, in the visual ventral stream, incoming signals are first represented as low-level edge filters and then transformed into high-level object representations. These same hierarchical structures routinely emerge in artificial neural networks (ANNs) trained for image/object recognition tasks, suggesting that a similar process might underlie biological neural networks. However, the classical ANN training algorithm, backpropagation, is considered biologically implausible, and thus several alternative biologically plausible methods have been developed. For instance, several cortical-inspired ANNs have been proposed in which the apical dendrite of a pyramidal neuron encodes top-down prediction signals. In this case, akin to theories of predictive coding, a prediction error can be calculated locally inside each neuron for updating its incoming weights. Nevertheless, from a neuroscience perspective, it is unclear whether neurons could compare their apical vs. somatic spiking activities to compute prediction errors. Here, we propose a solution to this problem by adapting the framework of the apical-somatic prediction error to the temporal domain. In particular, we show that if the apical feedback signal changes the postsynaptic firing rate, we can use differential Hebbian updates, a rate-based version of classical spike-timing-dependent plasticity (STDP). To the best of our knowledge, this is the first time a cortical-like deep ANN has been trained using such time-based learning rules. Overall, our work removes a key requirement of biologically plausible models for deep learning that does not align with plasticity rules observed in biology, and proposes a learning mechanism that would explain how the timing of neuronal activity can allow supervised hierarchical learning.
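The core learning rule described above can be illustrated with a minimal sketch. In a differential Hebbian update, the weight change is proportional to the presynaptic rate times the temporal derivative of the postsynaptic rate, so an apical feedback signal that nudges the somatic firing rate up (or down) potentiates (or depresses) the incoming weights. All names and the scalar-rate setup below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def differential_hebbian_update(w, r_pre, r_post_before, r_post_after,
                                eta=0.01, dt=1.0):
    """One differential Hebbian step: dw = eta * r_pre * d(r_post)/dt.

    The postsynaptic rate change (here, before vs. after an apical
    feedback nudge) plays the role of a locally computed prediction
    error, replacing an explicit apical-vs-somatic comparison.
    """
    dr_post = (r_post_after - r_post_before) / dt  # rate change from feedback
    return w + eta * r_pre * dr_post

# Toy example: feedback raises the postsynaptic rate from 2.0 to 3.0 Hz,
# so weights of active presynaptic inputs potentiate in proportion
# to their firing rates; silent inputs (rate 0) are unchanged.
w = np.zeros(3)
r_pre = np.array([1.0, 0.5, 0.0])
w_new = differential_hebbian_update(w, r_pre,
                                    r_post_before=2.0, r_post_after=3.0)
# w_new == [0.01, 0.005, 0.0]
```

With this sign convention the update reduces to plain Hebbian potentiation when feedback increases the rate and to depression when it decreases it, which is the rate-based analogue of the pre-before-post vs. post-before-pre asymmetry in STDP.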
Publisher
Cold Spring Harbor Laboratory
Cited by
2 articles.