Affiliation:
1. Faculty of Technology, CITEC, Bielefeld University
Abstract
While humans are used to reasoning about other humans’ behavior, they are not readily able to understand the decision processes of artificial agents. This can be harmful in human-robot interaction (HRI) settings, where a user may suspect erroneous or, even worse, intentionally non-cooperative behavior, resulting in reduced acceptance of the robot. To mitigate such negative effects, autonomous robots may be equipped with the ability to adequately explain their behavior. To that end, a robot is required to (1) robustly detect a user’s need for explanation and (2) identify the situation-specific nature of that need. Furthermore, it needs to be endowed with (3) communicative capabilities in order to deliver suitable explanations and ensure sufficient understanding. This extended abstract presents recent work towards endowing a social robot with such qualities and discusses how robots can meet users’ explanation needs more adequately.