Authors:
Beniaguev David, Segev Idan, London Michael
Abstract
We introduce a novel approach to study neurons as sophisticated I/O information processing units by utilizing recent advances in the field of machine learning. We trained deep neural networks (DNNs) to mimic the I/O behavior of a detailed nonlinear model of a layer 5 cortical pyramidal cell, receiving rich spatio-temporal patterns of input synapse activations. A temporally convolutional DNN (TCN) with seven layers was required to accurately, and very efficiently, capture the I/O of this neuron at the millisecond resolution. This complexity primarily arises from local NMDA-based nonlinear dendritic conductances. The weight matrices of the DNN provide new insights into the I/O function of cortical pyramidal neurons, and the approach presented can provide a systematic characterization of the functional complexity of different neuron types. Our results demonstrate that cortical neurons can be conceptualized as multi-layered “deep” processing units, implying that the cortical networks they form have a non-classical architecture and are potentially more computationally powerful than previously assumed.
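The TCN described in the abstract can be sketched as a stack of causal dilated 1-D convolutions: each output timestep depends only on present and past synaptic inputs, so the network can emit a per-millisecond spike probability. The layer count, channel widths, kernel size, and dilation schedule below are illustrative assumptions for a minimal sketch, not the paper's actual hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def causal_conv1d(x, w, dilation=1):
    """Causal 1-D convolution: output at time t uses only inputs at times <= t.
    x: (channels_in, T); w: (channels_out, channels_in, K)."""
    c_out, c_in, K = w.shape
    T = x.shape[1]
    pad = (K - 1) * dilation
    xp = np.pad(x, ((0, 0), (pad, 0)))  # left-pad so no future leakage
    y = np.zeros((c_out, T))
    for t in range(T):
        # taps at padded indices t+pad, t+pad-dilation, ..., t (oldest -> newest)
        idx = t + pad - np.arange(K - 1, -1, -1) * dilation
        taps = xp[:, idx]                       # (c_in, K)
        y[:, t] = np.einsum('oik,ik->o', w, taps)
    return y

def tcn_forward(x, weights):
    """Hidden layers: causal conv + ReLU with doubling dilation; final layer is a
    1-channel readout passed through a sigmoid (per-millisecond spike probability)."""
    h = x
    for i, w in enumerate(weights[:-1]):
        h = np.maximum(causal_conv1d(h, w, dilation=2 ** i), 0.0)
    z = causal_conv1d(h, weights[-1], dilation=2 ** (len(weights) - 1))
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes: 128 input synapses, 3 hidden layers of 16 channels, kernel size 5.
n_in, n_hid, K, T = 128, 16, 5, 200
shapes = [(n_hid, n_in, K), (n_hid, n_hid, K), (n_hid, n_hid, K), (1, n_hid, K)]
weights = [0.1 * rng.standard_normal(s) for s in shapes]

x = (rng.random((n_in, T)) < 0.02).astype(float)  # sparse binary synaptic activations
p_spike = tcn_forward(x, weights)                 # (1, T) spike probabilities
```

The left-only padding is what makes each layer causal; dilations that double per layer give the receptive field over past input that a spiking I/O model needs without a deep stack of wide kernels.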
Publisher
Cold Spring Harbor Laboratory
Cited by 8 articles.