Implementing mobile eye tracking in psychological research: A practical guide
Published: 2024-08-15
ISSN: 1554-3528
Container-title: Behavior Research Methods
Short-container-title: Behav Res
Language: en
Authors: Fu Xiaoxue, Franchak John M., MacNeill Leigha A., Gunther Kelley E., Borjon Jeremy I., Yurkovic-Harding Julia, Harding Samuel, Bradshaw Jessica, Pérez-Edgar Koraly E.
Abstract
Eye tracking provides direct, temporally and spatially sensitive measures of eye gaze. It can capture visual attention patterns from infancy through adulthood. However, commonly used screen-based eye tracking (SET) paradigms are limited in their depiction of how individuals process information as they interact with the environment in "real life". Mobile eye tracking (MET) records participant-perspective gaze in the context of active behavior. Recent technological developments in MET hardware enable researchers to capture egocentric vision as early as infancy and across the lifespan. However, challenges remain in MET data collection, processing, and analysis. The present paper aims to provide an introduction and practical guide for researchers new to the field, to facilitate the use of MET in psychological research with a wide range of age groups. First, we provide a general introduction to MET. Next, we briefly review MET studies in adults and children that provide new insights into attention and its roles in cognitive and socioemotional functioning. We then discuss technical issues relating to MET data collection and provide guidelines for data quality inspection, gaze annotations, data visualization, and statistical analyses. Lastly, we conclude by discussing future directions for MET implementation. Open-source programs for MET data quality inspection, data visualization, and analysis are shared publicly.
Funder
National Institute of Child Health and Human Development; National Science Foundation; National Institute of Mental Health; James S. McDonell Foundation; University of South Carolina
Publisher
Springer Science and Business Media LLC
References: 129 articles.