Affiliation:
1. The University of Sydney, Australia; Australian Research Council Centre of Excellence in Vision Science
2. The University of York, UK; The University of Sydney, Australia; Australian Research Council Centre of Excellence in Vision Science; UNSW Sydney, Australia
3. UNSW Sydney, Australia; The University of Sydney, Australia; Australian Research Council Centre of Excellence in Vision Science
Abstract
Identifying the spatial and temporal characteristics of visual feature binding remains a challenge in the science of perception. Within the feature-binding literature, disparate findings have suggested the existence of more than one feature-binding mechanism with differing temporal resolutions. For example, one surprising result is that temporal alternations between two different feature pairings of colour and motion (e.g., orange dots moving left with blue dots moving right) support accurate conjunction discrimination at alternation frequencies of around 10 Hz and greater. However, at lower alternation frequencies, around 5 Hz, conjunction discrimination falls to chance. To further investigate this effect, we present two experiments that probe the stimulus characteristics that facilitate or impede feature binding. Using novel manipulations of random dot kinematograms, we identify that facilitating surface representations through temporal integration can enable accurate conjunction discrimination at both intermediate and high alternation frequencies. We also offer a neurally plausible evidence accumulator model to describe these results, removing the need to posit multiple binding mechanisms acting at different timescales. In effect, we propose a single, flexible binding process, whereby the relatively low temporal resolution for binding features can be circumvented by extracting them from rapidly formed and persistent surface representations.
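The abstract mentions an evidence accumulator model applied to stimuli whose feature pairings alternate over time. As a minimal, illustrative sketch of that class of model (not the authors' actual implementation; the leaky-integrator form, all parameter values, and the signed-signal encoding of the two conjunctions are assumptions), a momentary conjunction signal can be integrated with leak and noise:

```python
import numpy as np

def leaky_accumulator(signal, leak=1.0, noise_sd=0.0, dt=0.001, seed=0):
    """Leaky evidence integration: dE = (s - leak * E) dt + noise.

    `signal` is the momentary evidence for one conjunction (+) versus
    the other (-). Returns the accumulated-evidence trace over time.
    """
    rng = np.random.default_rng(seed)
    E = 0.0
    trace = np.empty(len(signal))
    for i, s in enumerate(signal):
        E += (s - leak * E) * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        trace[i] = E
    return trace

def alternating_signal(freq_hz, duration_s=1.0, dt=0.001):
    """Square-wave evidence signal alternating between the two
    feature pairings at `freq_hz` (a stand-in for the alternating
    colour-motion stimulus described in the abstract)."""
    t = np.arange(0.0, duration_s, dt)
    return np.sign(np.sin(2 * np.pi * freq_hz * t))
```

With a constant signal the accumulator converges toward `signal / leak`; with a rapidly alternating signal the opposing epochs largely cancel within the integration window, so the sign and stability of the trace depend on alternation frequency relative to the leak time constant. How such a trace maps onto conjunction-discrimination accuracy is specified by the authors' model, not by this sketch.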
Subject
Artificial Intelligence,Sensory Systems,Experimental and Cognitive Psychology,Ophthalmology
Cited by
2 articles.