Authors:
Alex D. Bird, Hermann Cuntz
Abstract
Inspired by the physiology of neuronal systems in the brain, artificial neural networks have become an invaluable tool for machine learning applications. However, their biological realism and theoretical tractability are limited, leaving their parameters poorly understood. We have recently shown that biological neuronal firing rates in response to distributed inputs are largely independent of size, meaning that neurons are typically responsive to the proportion, not the absolute number, of their inputs that are active. Here we introduce such a normalisation, in which the strength of a neuron's afferents is divided by their number, to various sparsely connected artificial networks. Learning performance is dramatically increased, providing an improvement over other widely used normalisations in sparse networks. The resulting machine learning tools are universally applicable and biologically inspired, rendering them better understood and more stable in our tests.
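The normalisation described in the abstract amounts to scaling each unit's incoming weights by the reciprocal of its in-degree, so that a unit's drive reflects the fraction of its afferents that are active. The following is a minimal NumPy sketch of that idea under simple assumptions (a random binary connectivity mask and a tanh nonlinearity); all names and parameter values are illustrative, not the authors' actual implementation.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 100, 10
density = 0.1  # fraction of possible connections that exist

# Random sparse connectivity: mask[i, j] = 1 if input j feeds unit i.
mask = (rng.random((n_out, n_in)) < density).astype(float)
weights = rng.normal(size=(n_out, n_in)) * mask

# Dendritic normalisation: divide each unit's afferent weights by its
# in-degree, so the response depends on the proportion of active
# inputs rather than their absolute number.
in_degree = mask.sum(axis=1, keepdims=True)
in_degree[in_degree == 0] = 1.0  # avoid division by zero for isolated units
normalised_weights = weights / in_degree

x = rng.random(n_in)  # example input activity in [0, 1]
output = np.tanh(normalised_weights @ x)
print(output)

Because the scaling depends only on the connectivity mask, it can be applied once when the mask is created, or recomputed during training if connections are added or pruned.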
Publisher:
Cold Spring Harbor Laboratory
Cited by:
1 article.