Authors: Memunat A. Ibrahim, Zena Assaad, Elizabeth Williams
Abstract
Intelligent highly-automated systems (HASs) are increasingly being created and deployed at scale across a broad range of purposes and operational environments. In uncertain or safety-critical environments, HASs are frequently designed to cooperate seamlessly with humans, thus forming human-machine teams (HMTs) that pursue collective goals. Trust plays an important role in this dynamic: humans need to develop an appropriate level of trust in their HAS teammate(s) to form an HMT capable of working safely and effectively towards goal completion. Using autonomous ground vehicles (AGVs) as an example of an HAS operating in dynamic social contexts, we explore interdependent teaming and communication between humans and AGVs across different contexts, and examine the role trust and communication play in these teams. Drawing on lessons from the AGV example for the design of HASs used in HMTs more broadly, we argue that trust is experienced and built differently in different contexts, necessitating context-specific approaches to designing for trust in such systems.
Cited by: 2 articles.