Affiliation:
1. AudioLab, School of Physics, Engineering and Technology, University of York, York YO10 5DD, UK
2. Bang & Olufsen a/s, 7600 Struer, Denmark
Abstract
Camera-based solutions can be a convenient means of collecting physiological measurements indicative of psychological responses to stimuli. However, the low-illumination playback conditions commonly associated with viewing screen-based media oppose the bright conditions recommended for accurately recording physiological data with a camera. A study was designed to determine the feasibility of obtaining physiological data, for psychological insight, under illumination conditions representative of real-world viewing experiences. In this study, a novel method was applied to test a first-of-its-kind system for measuring both heart rate and facial actions from video footage recorded with a single, discreetly placed camera. Results suggest that conditions representative of a bright domestic setting should be maintained when using this technology, despite this being considered a sub-optimal playback condition. Further analyses highlight that, even within this bright condition, both the camera-measured facial action and heart rate data contained characteristic errors. In future research, the influence of these performance issues on psychological insights may be mitigated by reducing the temporal resolution of the heart rate measurements and by ignoring fast, low-intensity facial movements.
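The two mitigations proposed at the end of the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: `smooth_heart_rate` reduces the temporal resolution of a per-second heart-rate series by block averaging, and `filter_facial_actions` suppresses action-unit activations that are too weak or too brief. All function names, thresholds, and sampling rates are assumptions chosen for illustration.

```python
import numpy as np

def smooth_heart_rate(hr_bpm, window_s=10, fs=1.0):
    """Reduce temporal resolution of a heart-rate series (bpm) sampled at
    fs Hz by averaging over non-overlapping windows of window_s seconds."""
    hr_bpm = np.asarray(hr_bpm, dtype=float)
    n = int(window_s * fs)
    trimmed = hr_bpm[: len(hr_bpm) // n * n]  # drop the incomplete tail window
    return trimmed.reshape(-1, n).mean(axis=1)

def filter_facial_actions(au_intensity, min_intensity=1.0, min_duration=0.5, fs=30.0):
    """Zero out facial action-unit intensities that fall below min_intensity,
    or whose active runs last less than min_duration seconds at fs Hz."""
    au_intensity = np.asarray(au_intensity, dtype=float)
    active = au_intensity >= min_intensity
    out = np.where(active, au_intensity, 0.0)
    min_len = int(min_duration * fs)
    # remove contiguous active runs shorter than min_len samples
    i, n = 0, len(out)
    while i < n:
        if active[i]:
            j = i
            while j < n and active[j]:
                j += 1
            if j - i < min_len:
                out[i:j] = 0.0
            i = j
        else:
            i += 1
    return out
```

For example, averaging a 1 Hz heart-rate trace over 10 s windows turns 20 per-second estimates into 2 values, damping short-lived measurement errors while preserving slower trends.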
Funder
UK Arts and Humanities Research Council (AHRC) XR Stories Creative Industries Cluster project
University of York funded PhD studentship
Bang & Olufsen, Denmark
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by: 3 articles.