Abstract
The accurate teleoperation of robotic devices requires simple yet intuitive and reliable control interfaces. However, current human–machine interfaces (HMIs) often fail to meet these requirements, yielding systems that demand intensive practice before users reach sufficient operating expertise. Here, we present a systematic methodology to identify the spontaneous gesture-based interaction strategies of naive individuals with a distant device, and to exploit this information to develop a data-driven body–machine interface (BoMI) for efficient control of that device. We applied this approach to the specific case of drone steering and derived a simple control method relying on upper-body motion. The identified BoMI allowed participants with no prior experience to rapidly master the control of both simulated and real drones, outperforming joystick users and matching the control ability reached by participants using the bird-like flight simulator Birdly.
Funder
Schweizerischer Nationalfonds zur Förderung der Wissenschaftlichen Forschung
Fondation Bertarelli
Publisher
Proceedings of the National Academy of Sciences
Cited by 55 articles.