Author:
Avcu, Enes; Hwang, Michael; Brown, Kevin Scott; Gow, David W.
Abstract
Introduction: The notion of a single localized store of word representations has become increasingly less plausible as evidence has accumulated for widely distributed neural representations of wordform grounded in motor, perceptual, and conceptual processes. Here, we combine machine learning methods and neurobiological frameworks to propose a computational model of the brain systems potentially responsible for wordform representation. We tested the hypothesis that the functional specialization of word representation in the brain is driven in part by computational optimization. This hypothesis directly addresses the distinct problems of mapping sound to articulation vs. mapping sound to meaning.

Results: We found that artificial neural networks trained on the mapping between sound and articulation performed poorly on the mapping between sound and meaning, and vice versa. Moreover, a network trained on both tasks simultaneously could not discover the features required for efficient mapping between sound and higher-level cognitive states as well as either single-task model. Furthermore, these networks developed internal representations reflecting specialized, task-optimized functions without explicit training.

Discussion: Together, these findings demonstrate that task-specific representations lead to more focused responses and better performance in a machine or algorithm and, hypothetically, in the brain. We therefore suggest that the functional specialization of word representation mirrors a computational optimization strategy shaped by the nature of the tasks the human brain faces.
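The specialist-vs-cross-task contrast reported in the Results can be sketched with a toy experiment. Everything below is an illustrative assumption, not the authors' architecture or data: "sound" inputs and the "articulation" and "meaning" targets are random stand-ins, and the network is a minimal one-hidden-layer MLP. The point is only that a network fit to one mapping transfers poorly to an unrelated one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: shared "sound" inputs, two unrelated target
# mappings playing the roles of "articulation" and "meaning".
n, d_in, d_hid, d_out = 200, 20, 16, 10
X = rng.normal(size=(n, d_in))
Y_artic = np.tanh(X @ rng.normal(size=(d_in, d_out)))  # articulation targets
Y_mean = np.tanh(X @ rng.normal(size=(d_in, d_out)))   # meaning targets

def train(X, Y, epochs=500, lr=0.05):
    """Fit a one-hidden-layer tanh MLP by plain full-batch gradient descent."""
    W1 = rng.normal(scale=0.1, size=(X.shape[1], d_hid))
    W2 = rng.normal(scale=0.1, size=(d_hid, Y.shape[1]))
    for _ in range(epochs):
        H = np.tanh(X @ W1)                      # hidden activations
        E = H @ W2 - Y                           # prediction error
        gW2 = H.T @ E / len(X)                   # output-layer gradient
        gW1 = X.T @ ((E @ W2.T) * (1 - H**2)) / len(X)  # backprop through tanh
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def mse(X, Y, W1, W2):
    return float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))

# A "specialist" trained only on the articulation mapping...
W1a, W2a = train(X, Y_artic)
in_task = mse(X, Y_artic, W1a, W2a)   # its own task
off_task = mse(X, Y_mean, W1a, W2a)   # the unrelated meaning task
print(f"in-task MSE:  {in_task:.3f}")
print(f"off-task MSE: {off_task:.3f}")
```

Running this shows the specialist's error on its own task falling well below its error on the other mapping, a toy analogue of the dissociation the abstract describes; none of this bears on the specific features the trained networks in the study discovered.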
Funder
National Institute on Deafness and Other Communication Disorders
Cited by: 1 article.