Affiliations:
1. Department of Diagnostic Radiology, Yale University School of Medicine and Department of Neuroscience, University of Connecticut Health Center
2. Department of Epidemiology and Public Health and Department of Psychology, Yale University
Abstract
At each moment, we experience a mélange of information arriving through several senses, and we often focus on inputs from one modality while 'rejecting' inputs from another. Does input from a rejected sensory modality modulate one's ability to make decisions about information from a selected one? When the modalities are vision and hearing, the answer is "yes", suggesting that vision and hearing interact. In the present study, we asked whether similar interactions characterize vision and touch. As with vision and hearing, results obtained in a selective attention task show cross-modal interactions between vision and touch that depend on the synesthetic relationship between the stimulus combinations. These results imply that similar mechanisms may govern cross-modal interactions across sensory modalities.
Subject
Artificial Intelligence, Sensory Systems, Experimental and Cognitive Psychology, Ophthalmology
Cited by 90 articles.