Author:
Chen, Juan; Sperandio, Irene; Henry, Molly J.; Goodale, Melvyn A.
Abstract
Our visual system affords a distance-invariant percept of object size by integrating retinal image size with viewing distance (size constancy). Single-unit studies in animals have shown that real changes in distance can modulate the firing rate of neurons in primary visual cortex and even in subcortical structures, raising the intriguing possibility that the integration required for size constancy may occur during initial visual processing in V1 or even earlier. In humans, however, EEG and brain imaging studies have typically manipulated the apparent (not real) distance of stimuli using pictorial illusions, in which the cues to distance are sparse and not congruent. Here, we physically moved the monitor to different distances from the observer, a more ecologically valid paradigm that emulates what happens in everyday life. Using this paradigm in combination with electroencephalography (EEG), we were able for the first time to examine how the computation of size constancy unfolds in real time under real-world viewing conditions. We show that even when all distance cues were available and congruent, size constancy took about 150 ms to emerge in the activity of visual cortex. This 150-ms interval exceeds the time required for visual signals to reach V1, but is consistent with the time typically associated with later processing within V1 or with recurrent processing from higher-level visual areas. This finding therefore provides unequivocal evidence that size constancy does not occur during initial signal processing in V1 or earlier, but requires subsequent processing, much like other feature-binding mechanisms.
Publisher
Cold Spring Harbor Laboratory