Abstract
Information undergoes complex transformation processes in the brain, which introduce various errors. A daunting and critical challenge in neuroscience is to understand the origin of these errors and their effects on neural information processing. While previous efforts have made substantial progress in studying information errors under bounded, unreliable, and noisy transformation conditions, it remains elusive whether the neural system is inherently error-free under ideal, noise-free conditions. This work settles the controversy with a negative answer. We propose a novel neural information confusion theory, indicating the widespread presence of information confusion after the transmission process, which originates from innate neuron characteristics rather than external noise. We then reformulate the definition of zero-error capacity in the context of neuroscience, deriving an optimal upper bound on zero-error transformation rates determined by the tuning properties of neurons. Applying this theory to neural coding analysis, we unveil the multi-dimensional impacts of information confusion on neural coding: although it reduces the variability of neural responses and limits mutual information, it controls stimulus-irrelevant neural activity and improves the interpretability of neural responses with respect to stimuli. Together, the present study discovers an inherent and ubiquitous precision limit on neural information transformation, one that shapes the coding process of neural ensembles. These discoveries reveal that the neural system is intrinsically error-prone in information processing, even in the most ideal cases.

Author summary

One of the most central challenges in neuroscience is to understand the information processing capacity of the neural system.
Decades of effort have identified various errors in nonideal neural information processing, indicating that the neural system is not optimal at processing information because of the widespread presence of external noise and limitations. This remarkable progress, however, cannot address whether the neural system is essentially error-free and optimal under ideal information processing conditions, a question that has led to extensive controversy in neuroscience. Our work settles this well-known controversy with a negative answer. We demonstrate that the neural system is intrinsically error-prone in information processing even in the most ideal cases, challenging conventional ideas about its superior information processing capacity. We further show that the neural coding process is shaped by this innate limit, revealing how the characteristics of neural information functions, and of further cognitive functions, are determined by the inherent limitation of the neural system.
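To make the notion of a zero-error rate concrete, the following is a minimal illustrative sketch, not the paper's own model: stimuli are treated as vertices of a confusability graph, with an edge between two stimuli whenever they can evoke an identical neural response. Error-free discrimination is then only possible over a pairwise non-confusable stimulus set, so the one-shot zero-error rate is log2 of the graph's independence number. The graph and stimulus counts below are hypothetical examples.

```python
import math
from itertools import combinations

def max_independent_set_size(n, edges):
    """Brute-force independence number of a graph on vertices 0..n-1."""
    edge_set = {frozenset(e) for e in edges}
    # Search subsets from largest to smallest; the first independent
    # subset found has maximum size.
    for k in range(n, 0, -1):
        for subset in combinations(range(n), k):
            if all(frozenset(p) not in edge_set
                   for p in combinations(subset, 2)):
                return k
    return 0

# Hypothetical 5-stimulus example: confusable pairs form a pentagon
# (the C5 graph familiar from Shannon's zero-error analysis).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
alpha = max_independent_set_size(5, edges)   # largest non-confusable set
rate = math.log2(alpha)                      # one-shot zero-error rate, bits per use
```

Here only two of the five stimuli can be kept perfectly distinguishable, so the one-shot zero-error rate is 1 bit per use; repeated uses of the channel can exceed this one-shot bound, which is what Shannon's zero-error capacity captures.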
Publisher
Cold Spring Harbor Laboratory