Abstract
To find an object we are looking for, we must recognize it. Prevailing models of visual search neglect recognition, focusing instead on selective attention mechanisms. These models account for the performance limitations that participants exhibit when searching the highly simplified stimuli often used in laboratory tasks, but it is unclear how to apply them to complex natural images of real-world objects. Deep neural networks (DNNs) can be applied to any image, and have recently emerged as state-of-the-art models of object recognition in the primate ventral visual pathway. Using these DNN models, we ask whether object recognition explains the limitations on performance across visual search tasks. First, we show that DNNs exhibit a hallmark effect seen when participants search simplified stimuli. Further experiments show that this effect results from optimizing for object recognition: DNNs trained from randomly initialized weights do not exhibit the same performance limitations. Next, we test DNN models of object recognition with natural images, using a dataset in which each image has a visual search difficulty score derived from human reaction times. We find that DNN accuracy is inversely correlated with visual search difficulty score. Our findings suggest that visual search performance is explained, to a large extent, by object recognition.
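The natural-image analysis described above can be illustrated with a minimal sketch: score a pretrained DNN's recognition confidence on each image and correlate it with that image's human-derived search difficulty score. This is not the paper's code; the pretrained ResNet-50, the file `difficulty_scores.csv`, and its column names are assumptions chosen for illustration.

```python
# Illustrative sketch (assumed setup, not the paper's pipeline):
# correlate per-image DNN recognition confidence with a human-derived
# visual search difficulty score. The abstract predicts a negative
# correlation: harder-to-find images yield lower recognition confidence.
import csv

import torch
from torchvision import models
from torchvision.models import ResNet50_Weights
from PIL import Image
from scipy.stats import spearmanr

weights = ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()  # the weights' matching preprocessing

confidences, difficulties = [], []
# Hypothetical CSV with columns: image_path, difficulty
with open("difficulty_scores.csv") as f:
    for row in csv.DictReader(f):
        img = preprocess(Image.open(row["image_path"]).convert("RGB"))
        with torch.no_grad():
            probs = torch.softmax(model(img.unsqueeze(0)), dim=1)
        confidences.append(probs.max().item())  # top-1 confidence
        difficulties.append(float(row["difficulty"]))

rho, p = spearmanr(confidences, difficulties)
print(f"Spearman rho = {rho:.3f}, p = {p:.3g}")
```

A rank correlation is used here because difficulty scores derived from reaction times need not relate linearly to classifier confidence; any measure of per-image recognition performance (e.g., top-1 accuracy) could be substituted for the confidence used in this sketch.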
Publisher
Cold Spring Harbor Laboratory
Cited by
2 articles.