Affiliation:
1. College of Design, National Taipei University of Technology, Taipei 10608, Taiwan
2. School of Design Arts, Xiamen University of Technology, Xiamen 361024, China
Abstract
In complex environments, users frequently need to manage multiple tasks simultaneously, which poses significant challenges for user interface design. For instance, when driving, users must maintain continuous visual attention on the road ahead while also monitoring rearview mirrors and performing shoulder checks. Guiding users effectively in such multitasking scenarios is difficult. To address this challenge, we investigate and design visual and haptic guidance systems to augment users’ performance. We first propose a visual guidance technique that uses a dynamic arrow as the cue. Our evaluation shows that dynamic arrows significantly expedite both reaction and selection times. We then introduce and evaluate haptic feedback, which users perceive as more salient than visual guidance, leading to quicker responses when switching from the primary to the secondary task. This allows users to keep their visual attention on the primary task while responding effectively to haptic cues. Our findings suggest that multimodal guidance, especially haptic guidance, can improve both reaction time and user experience in dual-task environments, offering practical implications and guidelines for designing more user-friendly interfaces and systems.