Influences of luminance contrast and ambient lighting on visual context learning and retrieval
Published: 2020-09-04
Issue: 8
Volume: 82
Pages: 4007–4024
ISSN: 1943-3921
Container-title: Attention, Perception, & Psychophysics
Language: en
Short-container-title: Atten Percept Psychophys
Authors:
Zang Xuelian, Huang Lingyun, Zhu Xiuna, Müller Hermann J., Shi Zhuanghua
Abstract
Invariant spatial context can guide attention and facilitate visual search, an effect referred to as “contextual cueing.” Most previous studies of contextual cueing were conducted under conditions of photopic vision and high search-item-to-background luminance contrast, leaving open the question of whether the learning and/or retrieval of context cues depends on luminance contrast and ambient lighting. Given this, we conducted three experiments (each comprising two subexperiments) to compare contextual cueing under different combinations of luminance contrast (high/low) and ambient lighting (photopic/mesopic). With high-contrast displays, we found robust contextual cueing in both photopic and mesopic environments, but the acquired contextual cues did not transfer when the display contrast changed from high to low in the photopic environment. By contrast, with low-contrast displays, contextual facilitation manifested only in mesopic vision, and the acquired cues remained effective following a switch to high-contrast displays. This pattern suggests that, with low display contrast, contextual cueing benefited from a more global search mode, aided by the activation of the peripheral rod system in mesopic vision, but was impeded by a more local, fovea-centered search mode in photopic vision.
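As an aside for readers new to the paradigm: contextual cueing is standardly quantified as the mean reaction-time advantage for repeated (invariant-context) over novel search displays. The minimal Python sketch below illustrates that computation; the reaction times and variable names are illustrative assumptions, not data from this study.

```python
import statistics

# Hypothetical mean search RTs (ms) per training epoch,
# split by display type: "repeated" = invariant spatial contexts,
# "novel" = newly generated distractor layouts.
rts = {
    "repeated": [1020, 980, 910, 870, 850],
    "novel":    [1030, 1010, 990, 985, 975],
}

# Contextual-cueing effect: RT advantage for repeated over novel displays.
cueing_effect = statistics.mean(rts["novel"]) - statistics.mean(rts["repeated"])
print(f"Contextual cueing effect: {cueing_effect:.0f} ms")  # positive = facilitation
```

In a transfer test of the kind described above, the same difference would be computed separately before and after the display-contrast switch to see whether the learned facilitation survives the change.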
Funder
Ludwig-Maximilians-Universität München
Publisher
Springer Science and Business Media LLC
Subject
Linguistics and Language, Sensory Systems, Language and Linguistics, Experimental and Cognitive Psychology
References
64 articles.
Cited by
5 articles.