Authors:
Watanabe Yoshiaki, Nagahama Kotaro, Yamazaki Kimitoshi, Okada Kei, Inaba Masayuki
Abstract
This paper describes a system integration for a life-sized robot working in a kitchen. Cooking tasks involve various tools and foods, and the cooking table may have a reflective surface with blots and scratches, so recognition functions must be robust to the noise these cause. In addition, cooking behaviors impose motion sequences that use the robot's whole body. For instance, while cutting a vegetable, the robot has to hold one hand against the vegetable while the other hand moves a knife to cut it, which requires the full articulation of the robot to be considered simultaneously. In other words, difficulties arise in both recognition and motion generation. In this paper we propose recognition functions for detecting kitchen tools such as containers and cutting boards. These functions are improved to overcome the influence of reflective surfaces, and a shape model combined with task knowledge is also proposed. We also point out the importance of using torso joints during dual-arm manipulation; our approach enables the robot to keep the manipulability of both arms and the viewing field of the head. Building on these components, we introduce an integrated system that incorporates the recognition modules and the motion generation modules. The effectiveness of the system was proven through several cooking applications.
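The abstract refers to keeping the manipulability of both arms while torso joints are shared between them during dual-arm manipulation. As a point of reference only, the sketch below computes the standard Yoshikawa manipulability measure from a task Jacobian; the Jacobian dimensions, the function name, and the use of NumPy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def manipulability(J: np.ndarray) -> float:
    """Yoshikawa's manipulability measure w = sqrt(det(J @ J.T)).

    J is the task Jacobian (task dimensions x joint dimensions).
    Values near zero indicate that the kinematic chain is close to a
    singular configuration, i.e. the end effector is losing mobility
    in some task direction.
    """
    return float(np.sqrt(np.linalg.det(J @ J.T)))

# Hypothetical example: a 6-DoF task Jacobian for one 7-joint arm
# extended with 2 shared torso joints. The redundancy contributed by
# the torso can be exploited to keep this measure high for both arms
# at once while the head keeps the workspace in view.
rng = np.random.default_rng(0)
J_arm_plus_torso = rng.standard_normal((6, 9))
print(manipulability(J_arm_plus_torso))
```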
Subject
Behavioral Neuroscience, Artificial Intelligence, Cognitive Neuroscience, Developmental Neuroscience, Human-Computer Interaction
Cited by
12 articles.