Abstract
When humans navigate through complex environments, they coordinate gaze and steering to sample the visual information needed to guide movement. Gaze and steering behavior have been extensively studied in the context of automobile driving along a winding road, leading to accounts of movement along well-defined paths over flat, obstacle-free surfaces. However, humans are also capable of visually guiding self-motion in environments that are cluttered with obstacles and lack an explicit path. An extreme example of such behavior occurs during first-person view drone racing, in which pilots maneuver at high speeds through a dense forest. In this study, we explored the gaze and steering behavior of skilled drone pilots. Subjects guided a simulated quadcopter along a racecourse embedded within a custom-designed forest-like virtual environment. The environment was viewed through a head-mounted display equipped with an eye tracker to record gaze behavior. In two experiments, subjects performed the task in multiple conditions that varied in terms of the presence of obstacles (trees), waypoints (hoops to fly through), and a path to follow. Subjects often looked in the general direction of things that they wanted to steer toward, but gaze fell on nearby objects and surfaces more often than on the actual path or hoops. Nevertheless, subjects were able to perform the task successfully, steering at high speeds while remaining on the path, passing through hoops, and avoiding collisions. In conditions that contained hoops, subjects adapted how they approached the most immediate hoop in anticipation of the position of the subsequent hoop. Taken together, these findings challenge existing models of steering that assume that steering is tightly coupled to where actors look. We consider the study’s broader implications as well as limitations, including the focus on a small sample of highly skilled subjects and inherent noise in measurement of gaze direction.
Funder
Directorate for Social, Behavioral and Economic Sciences
Office of Naval Research Global
Publisher
Public Library of Science (PLoS)