Affiliation:
1. ETIS Laboratory, CY Cergy-Paris University, ENSEA, CNRS, UMR8051, Cergy, France
2. IPAL, CNRS, Singapore
Abstract
We propose that coding and decoding in the brain are achieved through digital computation using three principles: relative ordinal coding of inputs, random connections between neurons, and belief voting. We show that, thanks to randomization and despite the coarseness of the relative codes, these principles are sufficient for coding and decoding sequences with error-free reconstruction. In particular, the number of neurons needed grows only linearly while the size of the input repertoire grows exponentially. We illustrate our model by reconstructing sequences with repertoires on the order of a billion items. From this, we derive the Shannon equations for the capacity limit to learn and transfer information in the neural population, which we then generalize to any type of neural network. Following the maximum entropy principle of efficient coding, we show that random connections serve to decorrelate redundant information in incoming signals, creating more compact codes for neurons and thereby conveying a larger amount of information. Hence, despite the unreliability of the relative codes, only a few neurons are needed to discriminate the original signal without error. Finally, we discuss the significance of this digital computation model with respect to neurobiological findings in the brain and, more generally, to artificial intelligence algorithms, with a view toward a neural information theory and the design of digital neural networks.
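The three principles named in the abstract can be illustrated with a minimal sketch. All names, sizes, and the voting rule below are illustrative assumptions, not the paper's actual method: fixed random weights stand in for random connections, the rank order of neuron activations (discarding analog values) stands in for relative ordinal coding, and decoding counts rank agreements with stored codes as a simplified form of belief voting.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, dim = 16, 32  # illustrative sizes, not from the paper

# Random connections: each neuron projects the input through fixed random weights
W = rng.normal(size=(n_neurons, dim))

def ordinal_code(x):
    """Relative ordinal code: keep only the rank order of neuron
    activations, discarding the analog activation values."""
    return np.argsort(W @ x)

# A small repertoire of items, each stored only as its ordinal code
repertoire = rng.normal(size=(8, dim))
codes = [ordinal_code(x) for x in repertoire]

def decode(x):
    """Simplified belief voting: each rank position votes for the stored
    item it agrees with; the item with the most agreements wins."""
    c = ordinal_code(x)
    votes = [int(np.sum(c == stored)) for stored in codes]
    return int(np.argmax(votes))

# A mildly perturbed query tends to decode to the stored item, because
# most pairwise activation orderings survive small input noise
query = repertoire[3] + 0.05 * rng.normal(size=dim)
print(decode(query))
```

The point of the sketch is that the code is coarse (a permutation, not real values), yet the random projection spreads information across neurons so that agreement counting usually recovers the stored item even under perturbation.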
Publisher
Proceedings of the National Academy of Sciences
Cited by 2 articles.