Affiliation:
1. Graduate School of Automotive Engineering, Kookmin University, Seoul 02707, Republic of Korea
Abstract
Interacting with an in-vehicle system through a central console is known to induce visual and biomechanical distractions, thereby delaying the danger recognition and response times of the driver and significantly increasing the risk of an accident. To address this problem, various hand gestures have been developed. Although such gestures can reduce visual demand, they are limited in number, lack passive feedback, can be vague and imprecise, are difficult to understand and remember, and are culture-bound. To overcome these limitations, we developed a novel on-wheel finger-spreading gestural interface combined with a head-up display (HUD) that allows the user to select a menu item shown on the HUD with a gesture. This interface presents the audio and air-conditioning functions of the central console on the HUD and enables their control using a specific number of spread fingers while keeping both hands on the steering wheel. We compared the effectiveness of the proposed hybrid interface against a traditional tactile central-console interface using objective measurements and subjective evaluations of both vehicle and driver behaviour. A total of 32 subjects were recruited to conduct experiments on a driving simulator equipped with the proposed interface under various scenarios. The results showed that the proposed interface was approximately 20% faster in emergency response than the traditional interface, whereas its performance in maintaining vehicle speed and lane position did not differ significantly from that of the traditional one.
Funder
National Research Foundation of Korea
Ministry of Education
Publisher
Oxford University Press (OUP)
Subject
Computational Mathematics, Computer Graphics and Computer-Aided Design, Human-Computer Interaction, Engineering (miscellaneous), Modelling and Simulation, Computational Mechanics
Cited by: 9 articles.