Abstract
Humans and other animals effortlessly identify sounds and categorize them into behaviorally relevant categories. Yet the acoustic features and neural transformations that enable the formation of perceptual categories are largely unknown. Here we demonstrate that correlation statistics between frequency-organized cochlear sound channels are reflected in the neural ensemble activity of the auditory midbrain and that such activity, in turn, can contribute to the discrimination of perceptual categories. Using multi-channel neural recordings in the auditory midbrain of unanesthetized rabbits, we first demonstrate that neuron ensemble correlations are highly structured in both time and frequency and can be decoded to distinguish sounds. Next, we develop a probabilistic framework for measuring the nonstationary spectro-temporal correlation statistics between frequency-organized channels in an auditory model. In a 13-category sound identification task, classification accuracy is consistently high (>80%), improving with sound duration and plateauing at ~1–3 seconds, mirroring human performance trends. Nonstationary short-term correlation statistics are more informative about sound category than time-averaged correlation statistics (84% vs. 73% accuracy). When tested independently, the spectral and temporal correlations between the model outputs achieve similar performance and appear to contribute equally. These results outline a plausible neural code in which correlation statistics between neuron ensembles tuned to different frequencies can be read out to identify and distinguish acoustic categories.
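To make the central statistic concrete, the sketch below is a minimal illustration (not the authors' implementation) of the distinction the abstract draws between short-term and time-averaged correlation statistics: it computes windowed pairwise correlation matrices across the frequency channels of a model cochleogram, alongside a single time-averaged correlation matrix. The synthetic input, function name, and window parameters are all hypothetical stand-ins for the paper's auditory-model outputs.

```python
import numpy as np

def short_term_channel_correlations(cochleogram, win=50, hop=25):
    """Windowed pairwise correlations between frequency channels.

    cochleogram : (n_channels, n_samples) array of channel envelopes
    win, hop    : window length and step, in samples (illustrative values)
    Returns an (n_windows, n_channels, n_channels) stack of correlation
    matrices, i.e. the nonstationary short-term statistics.
    """
    n_ch, n_t = cochleogram.shape
    mats = []
    for start in range(0, n_t - win + 1, hop):
        seg = cochleogram[:, start:start + win]
        mats.append(np.corrcoef(seg))  # channels-by-channels correlations
    return np.stack(mats)

# Demo with random data standing in for auditory-model output
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 1000))           # 16 channels, 1000 samples
C_short = short_term_channel_correlations(x)  # short-term (nonstationary)
C_avg = np.corrcoef(x)                        # single time-averaged matrix
```

Under this reading, a classifier fed the stack of short-term matrices sees how cross-channel correlations evolve over time, whereas one fed only `C_avg` sees a single summary; the abstract reports that the former supports higher category accuracy (84% vs. 73%).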
Publisher: Cold Spring Harbor Laboratory