Multi-Dimensional Evaluation of an Augmented Reality Head-Mounted Display User Interface for Controlling Legged Manipulators

Authors:

Rodrigo Chacón Quesada (1), Yiannis Demiris (1)

Affiliation:

1. Personal Robotics Laboratory, Imperial College London, London, United Kingdom

Abstract

Controlling assistive robots can be challenging for some users, especially those lacking relevant experience. Augmented Reality (AR) User Interfaces (UIs) have the potential to facilitate this task. Although extensive research on legged manipulators exists, comparatively little addresses their UIs. Most existing UIs leverage traditional control interfaces such as joysticks, Hand-Held (HH) controllers and 2D UIs. These interfaces not only risk being unintuitive, thus discouraging interaction with the robot partner, but also draw the operator’s focus away from the task and towards the UI. This shift in attention raises additional safety concerns, particularly in the potentially hazardous environments where legged manipulators are frequently deployed. Moreover, traditional interfaces leave operators unable to use their hands for other tasks. Towards overcoming these limitations, in this article we present a user study comparing an AR Head-Mounted Display (HMD) UI we developed for controlling a legged manipulator against off-the-shelf control methods for such robots. The study involved 27 participants and 135 trials, from which we gathered over 405 completed questionnaires. The trials involved multiple navigation and manipulation tasks of varying difficulty using a Boston Dynamics Spot, a 7-DoF Kinova robot arm and a Robotiq 2F-85 gripper that we integrated into a legged manipulator. We compared the UIs across multiple dimensions relevant to successful human–robot interaction: cognitive workload, technology acceptance, fluency, system usability, immersion and trust. Our study employed a factorial experimental design with participants undergoing five different conditions, generating longitudinal data.
Because such data may contain unknown distributions and outliers, parametric methods are questionable for its analysis, and while non-parametric alternatives exist, they can reduce statistical power. We therefore chose Bayesian data analysis as an effective alternative that addresses both limitations. Our results show that AR UIs can outpace HH-based control methods and reduce cognitive demands when designers incorporate hands-free interaction and cognitive-offloading principles into the UI. Furthermore, using the AR UI together with our cognitive offloading feature resulted in higher usability scores and significantly higher fluency and Technology Acceptance Model scores. Regarding immersion, the response values for the AR Immersion questionnaire associated with the AR UI were significantly higher than those associated with the HH UI, regardless of the main interaction method with the former, i.e., hand gestures or cognitive offloading. Based on the participants’ qualitative answers, we attribute this to a combination of factors, the most important being the free use of the hands when wearing the HMD, together with the ability to see the real environment without diverting attention to the UI. Regarding trust, our findings did not reveal discernible differences in reported trust scores across UI options. However, during the manipulation phase of the study, where participants could select their preferred UI, they consistently reported higher levels of trust than in the navigation category. Moreover, the percentage of participants who selected the AR UI for the manipulation stage changed drastically after we incorporated the cognitive offloading feature.
Thus, trust seems to have mediated the use and non-use of the UIs along dimensions different from the ones considered in our study, i.e., delegation and reliance. Overall, our AR HMD UI for the control of legged manipulators improved human–robot interaction across several relevant dimensions, underscoring the critical role of UI design in the effective and trustworthy utilisation of robotic systems.
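The Bayesian analysis motivated above can be illustrated with a minimal sketch (not the authors' actual pipeline): a grid-approximated posterior over each condition's mean questionnaire score under a Student-t likelihood, whose heavy tails make the estimate robust to outliers in a way a normal-likelihood (parametric) model is not. The data, grid, scale and degrees-of-freedom values below are hypothetical.

```python
import numpy as np

def bayesian_mean_posterior(samples, mu_grid, sigma=1.0, nu=4):
    """Grid-approximate posterior over a group mean under a
    Student-t likelihood (robust to outliers), flat prior on mu."""
    # log-likelihood of every observation at every candidate mean
    z = (samples[None, :] - mu_grid[:, None]) / sigma
    log_lik = -0.5 * (nu + 1) * np.log1p(z**2 / nu)
    log_post = log_lik.sum(axis=1)
    log_post -= log_post.max()          # numerical stability
    post = np.exp(log_post)
    return post / post.sum()            # normalise over the grid

rng = np.random.default_rng(0)
# hypothetical 7-point questionnaire scores for two UI conditions
ar_ui = rng.normal(5.8, 0.6, size=27)
hh_ui = rng.normal(5.0, 0.6, size=27)
hh_ui[0] = 1.0                          # one outlying response

mu_grid = np.linspace(0.0, 7.0, 1401)
p_ar = bayesian_mean_posterior(ar_ui, mu_grid)
p_hh = bayesian_mean_posterior(hh_ui, mu_grid)

# posterior probability that the AR UI mean exceeds the HH UI mean:
# sum p_ar[i] * p_hh[j] over all grid pairs with mu_grid[i] > mu_grid[j]
joint = p_ar[:, None] * p_hh[None, :]
prob_ar_higher = joint[np.tril_indices(len(mu_grid), k=-1)].sum()
```

Unlike a p-value, `prob_ar_higher` is a direct posterior statement about the size and direction of the condition difference, which is the kind of quantity the Bayesian analysis in the study reports.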

Funder

UKRI

Publisher

Association for Computing Machinery (ACM)

References: 91 articles.
