Affiliation:
1. Department of Biomedical Engineering, Boston University; and
2. Departments of Neurology and Radiology, Harvard Medical School, Boston, Massachusetts
Abstract
Segmentation of the visual scene into relevant object components is a fundamental process for successfully interacting with our surroundings. Many visual cues, including motion and binocular disparity, support segmentation, yet the mechanisms that use these cues remain unclear. We used a psychophysical motion discrimination task in which noise dots were displaced in depth to investigate the role of disparity cues in segmenting visual motion stimuli (experiment 1). We found a subtle but significant bias: near-disparity noise disrupted the segmentation of motion more than equidistant far-disparity noise. A control experiment showed that this near-far difference could not be attributed to attention (experiment 2). To account for the near-far bias, we constructed a biologically constrained model, built from recordings of neurons in the middle temporal area (MT), to simulate human observers' performance on experiment 1. The model's performance showed a near-disparity skew similar to that of human observers. To isolate the cause of the skew, we simulated performance of a model containing units derived from the properties of MT neurons, using phase-modulated Gabor disparity tuning. With a skewed-normal population distribution of preferred disparities, the model reproduced the elevated motion discrimination thresholds for near-disparity noise, whereas a skewed-normal population of phases (creating individually asymmetric units) did not produce any performance skew. These results suggest that the properties of neurons in area MT are computationally sufficient to perform disparity segmentation during motion processing and to produce disparity biases similar to those shown by human observers.
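The model components named in the abstract (phase-modulated Gabor disparity tuning and a skew-normal population of preferred disparities) can be illustrated with a minimal Python sketch. This is a reconstruction under stated assumptions, not the authors' implementation: the function name gabor_tuning, all parameter values, the sign convention (negative disparity = near/crossed), and the direction of the population skew are illustrative choices not specified in the abstract.

```python
# Minimal sketch (illustrative assumptions throughout; not the authors' code).
import numpy as np
from scipy.stats import skewnorm

def gabor_tuning(disparity, pref_disparity, sigma=0.5, freq=0.6,
                 phase=0.0, amp=1.0, baseline=0.1):
    """Gabor disparity tuning: Gaussian envelope times a cosine carrier.

    `disparity` and `pref_disparity` are in degrees of visual angle;
    parameter values are illustrative, not fitted to MT recordings.
    """
    envelope = np.exp(-((disparity - pref_disparity) ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * freq * (disparity - pref_disparity) + phase)
    return baseline + amp * envelope * carrier

# Skew-normal population of preferred disparities. The skew direction
# (shape parameter a) and scale are assumptions; the abstract reports only
# that a skewed-normal distribution of preferred disparities reproduced
# the near-disparity bias, while a skewed distribution of phases did not.
rng = np.random.default_rng(0)
pref_disparities = skewnorm.rvs(a=-4.0, loc=0.0, scale=0.4, size=200,
                                random_state=rng)

# Population responses to noise dots at a near (crossed, negative by this
# sketch's convention) disparity versus an equidistant far disparity.
near_response = gabor_tuning(-0.3, pref_disparities)
far_response = gabor_tuning(+0.3, pref_disparities)
```

Because the population of preferred disparities is asymmetric while each unit's tuning curve is symmetric about its peak, the summed population response to near-disparity noise differs from the response to equidistant far-disparity noise, which is the kind of asymmetry the model uses to reproduce the behavioral bias.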
Publisher
American Physiological Society
Subject
Physiology, General Neuroscience
Cited by
2 articles.