Upper limb exercise with physical and virtual robots: Visual sensitivity affects task performance

Published: 2021-01-01
Volume: 12
Issue: 1
Pages: 199–213
ISSN: 2081-4836
Container-title: Paladyn, Journal of Behavioral Robotics
Language: en

Authors:
Pauline Chevalier (1), Valentina Vasco (2), Cesco Willemse (1), Davide De Tommaso (1), Vadim Tikhanoff (2), Ugo Pattacini (2), Agnieszka Wykowska (1)

Affiliations:
1. Social Cognition in Human-Robot Interaction, Istituto Italiano di Tecnologia (IIT), Genoa, 16152, Italy
2. iCub, Istituto Italiano di Tecnologia (IIT), Genoa, 16163, Italy
Abstract
We investigated the influence of visual sensitivity on performance of an imitation task with the robot R1 in its virtual and physical forms. Virtual and physical embodiments offer different sensory experiences to users. Because individuals respond differently to their sensory environment, their sensory sensitivity may play a role in their interaction with a robot, and investigating this influence is a helpful tool for evaluating and designing such interactions. Here, we asked 16 participants to perform an imitation task with a virtual and a physical robot under conditions of full and occluded visibility, to report the strategy they used to perform the task, and to complete the Sensory Perception Quotient questionnaire. Sensory sensitivity in vision predicted participants' performance in imitating the robot's upper-limb movements. From the self-report questionnaire, we observed that participants relied more on visual cues when performing the task with the physical robot than with the virtual robot. From these results, we propose that a physical embodiment allows the user to invest less cognitive effort in an imitation task than a virtual embodiment does. These results are encouraging and suggest that this line of research is suitable for improving and evaluating the effects of physical and virtual robot embodiment in healthy and clinical settings.
Publisher
Walter de Gruyter GmbH
Subject
Behavioral Neuroscience, Artificial Intelligence, Cognitive Neuroscience, Developmental Neuroscience, Human-Computer Interaction
Cited by
1 article.