Understanding the Impact of Neural Variations and Random Connections on Inference

Authors:

Zeng Yuan, Ferdous Zubayer Ibne, Zhang Weixiang, Xu Mufan, Yu Anlan, Patel Drew, Post Valentin, Guo Xiaochen, Berdichevsky Yevgeny, Yan Zhiyuan

Abstract

Recent research suggests that in vitro neural networks created from dissociated neurons may be used for computing and performing machine learning tasks. A hybrid bio-silicon computer is worth exploring as a path toward better artificial intelligence systems, but its performance is still inferior to that of a silicon-based computer. One reason may be that a living neural network has many intrinsic properties, such as random network connectivity, high network sparsity, and large neural and synaptic variability. These properties may lead to new design considerations, and existing algorithms need to be adjusted for living neural network implementation. This work investigates the impact of neural variations and random connections on inference with learning algorithms. A two-layer hybrid bio-silicon platform is constructed, and a five-step design method is proposed for the fast development of living neural network algorithms. Neural variations and dynamics are verified by fitting model parameters to biological experimental results. Random connections are generated under different connection probabilities to vary network sparsity. A multi-layer perceptron algorithm is tested with biological constraints on the MNIST dataset. The results show that a reasonable inference accuracy can be achieved despite the presence of neural variations and random network connections. A new adaptive pre-processing technique is proposed to ensure good learning accuracy across different living neural network sparsity levels.
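The abstract describes generating random connections under different connection probabilities to vary network sparsity in a multi-layer perceptron. The paper's actual implementation is not reproduced here, but the idea can be sketched as a fixed Bernoulli connectivity mask gating a dense weight matrix; the names (`SparseLayer`, `random_mask`) and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_mask(n_in, n_out, p_conn, rng):
    """Bernoulli connectivity: each possible synapse exists with probability p_conn."""
    return (rng.random((n_in, n_out)) < p_conn).astype(np.float32)

class SparseLayer:
    """Dense layer whose weights are gated by a fixed random connectivity mask."""
    def __init__(self, n_in, n_out, p_conn, rng):
        self.mask = random_mask(n_in, n_out, p_conn, rng)  # fixed at construction
        self.w = rng.normal(0.0, 0.1, (n_in, n_out)).astype(np.float32)

    def forward(self, x):
        # Only the randomly chosen connections contribute to the output;
        # lowering p_conn increases network sparsity.
        return x @ (self.w * self.mask)

# Example: an MNIST-sized input layer (784 pixels) with 20% connection probability.
layer = SparseLayer(784, 100, p_conn=0.2, rng=rng)
x = rng.random((1, 784)).astype(np.float32)
y = layer.forward(x)
sparsity = 1.0 - layer.mask.mean()  # fraction of absent connections, ~0.8
```

Sweeping `p_conn` in such a setup is one simple way to study how inference accuracy degrades (or is preserved by pre-processing) as connectivity becomes sparser.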

Publisher

Frontiers Media SA

Subject

Cellular and Molecular Neuroscience, Neuroscience (miscellaneous)


Cited by 1 article.


