Abstract
The brain interprets sensory inputs to guide behavior, but behavior disrupts sensory inputs. In primates, saccadic eye movements displace visual images on the retina and yet the brain perceives visual stability, a process called active vision. We studied whether active vision is Bayesian. Humans and monkeys reported whether an image moved during saccades. We tested whether they used prior expectations to account for sensory uncertainty in a Bayesian manner. For continuous judgments, subjects were Bayesian. For categorical judgments, they were anti-Bayesian for uncertainty due to external, image noise but Bayesian for uncertainty due to internal, motor-driven noise. A discriminative learning model explained the anti-Bayesian effect. Therefore, active vision uses both Bayesian and discriminative models depending on task requirements (continuous vs. categorical) and the source of uncertainty (image noise vs. motor-driven noise), suggesting that active perceptual mechanisms are governed by the interaction of both models.
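To make the contrast concrete, the following is a minimal Python sketch of the two model classes the abstract compares: a Bayesian (generative) observer that combines a prior over image displacement with a noisy measurement, versus a discriminative rule that maps the measurement directly to a report. All function names, parameter values, and the Gaussian/logistic forms are illustrative assumptions, not taken from the paper.

import math

# Illustrative sketch (not from the paper): a Bayesian observer estimates an
# image displacement x from a noisy measurement m = x + noise, assuming
#   prior:      x ~ N(0, sigma_prior^2)   (images rarely move during saccades)
#   likelihood: m | x ~ N(x, sigma_noise^2)
# The posterior mean shrinks the measurement toward the prior mean of zero,
# and the shrinkage grows as sensory uncertainty (sigma_noise) grows.

def bayesian_estimate(m, sigma_prior=1.0, sigma_noise=1.0):
    """Posterior mean of x given measurement m (conjugate Gaussian case)."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_noise**2)  # reliability weight
    return w * m

# A discriminative alternative (cf. logistic regression vs. naive Bayes):
# map the measurement directly to the probability of reporting "moved",
# learning a decision boundary without modeling how m was generated.
def discriminative_report(m, weight=2.0, bias=-1.0):
    """P('image moved' | m) from a logistic rule on the measurement size."""
    return 1.0 / (1.0 + math.exp(-(weight * abs(m) + bias)))

if __name__ == "__main__":
    for sigma_noise in (0.5, 2.0):  # low vs. high sensory noise
        print("Bayesian estimate:", round(bayesian_estimate(2.0, 1.0, sigma_noise), 3))
    print("P(report moved):", round(discriminative_report(2.0), 3))

Under this sketch, increasing sensory noise pulls the Bayesian estimate toward the prior (no displacement), whereas the discriminative rule's output depends only on the learned decision boundary over the measurement, which is one way the two accounts can diverge.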
Publisher: Cold Spring Harbor Laboratory