Affiliation:
1. ETH Zürich, Switzerland
2. University of St. Gallen, Switzerland
3. Singapore Management University, Singapore
4. Karlsruhe Institute of Technology, Germany
Abstract
An empathetic car capable of reading its driver's emotions has been envisioned by many car manufacturers. Emotion inference enables in-vehicle applications that improve driver comfort, well-being, and safety. Existing approaches use physiological, facial, and speech-related data to infer emotions during driving trips, but they have two major limitations: relying on sensors that are not built into the vehicle restricts emotion inference to people who carry the corresponding devices (e.g., smartwatches), and relying on modalities such as facial expressions and speech raises privacy concerns. By contrast, researchers in mobile health have inferred affective states (e.g., emotions) from behavioral and contextual patterns decoded from readily available sensor streams, e.g., those of smartphones. We transfer this rationale to an in-vehicle setting by analyzing the feasibility of inferring driver emotions from passively interpreted data streams of the controller area network (CAN bus) and the traffic context (inferred from the front-view camera). Our approach therefore does not rely on particularly privacy-sensitive data streams such as driver facial video or driver speech, but builds on CAN-bus data and traffic information that is available in current high-end and future vehicles. To assess the approach, we conducted a four-month field study on public roads covering a variety of uncontrolled daily driving activities, so our results were generated beyond the confines of a laboratory environment. Ultimately, the proposed approach accurately recognizes drivers' emotions and achieves performance comparable to a state-of-the-art baseline based on medical-grade physiological sensors.
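The passive-sensing idea in the abstract can be illustrated with a toy sketch: fixed-length windows of CAN-bus signals are summarized into behavioral features, which a classifier then maps to an affective label. Everything below is hypothetical for illustration; the signal names, thresholds, and the hand-written rule are assumptions, and the paper's actual pipeline (and its learned model) is not reproduced here.

```python
from statistics import mean, stdev

def window_features(speed, steering, throttle):
    """Summarize one window of (hypothetical) CAN-bus signals into
    simple behavioral features for driver-emotion inference."""
    return {
        "speed_mean": mean(speed),        # average speed in the window
        "steering_std": stdev(steering),  # steering variability (erratic = high)
        "throttle_mean": mean(throttle),  # average accelerator position
    }

def classify_arousal(features, steering_thresh=5.0, throttle_thresh=0.6):
    """Toy rule: erratic steering combined with heavy throttle use is read
    as high arousal. A real system would instead train a supervised model
    on trips labeled with self-reported emotions."""
    if (features["steering_std"] > steering_thresh
            and features["throttle_mean"] > throttle_thresh):
        return "high"
    return "low"

# Two synthetic windows: a calm cruise and a tense, erratic stretch.
calm = window_features([50.0, 51.0, 50.5], [0.1, -0.2, 0.0], [0.3, 0.35, 0.3])
tense = window_features([80.0, 95.0, 70.0], [-8.0, 9.0, -7.5], [0.9, 0.8, 0.95])
print(classify_arousal(calm))   # -> low
print(classify_arousal(tense))  # -> high
```

The sketch only shows the shape of the pipeline (windowing, feature summaries, per-window labels); the study's contribution lies in doing this from data already on the CAN bus rather than from added sensors.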
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Networks and Communications,Hardware and Architecture,Human-Computer Interaction
References (93 articles).
1. Audi AG. 2018. Audi Elaine. https://www.audi.com/en/experience-audi/models-and-technology/concept-cars/audi-elaine.html Accessed: 2021-04-23.
2. Are Emotions Natural Kinds?
3. Michael Braun, Florian Weber, and Florian Alt. 2020. Affective Automotive User Interfaces - Reviewing the State of Emotion Regulation in the Car. arXiv:cs.HC/2003.13731
4. Outliers in Smartphone Sensor Data Reveal Outliers in Daily Happiness
Cited by
19 articles.