Affiliation: University of Toronto, Canada
Abstract
This chapter presents a real-time, robust affect classification methodology for socially interactive robots engaging in one-on-one human-robot interactions (HRI). The methodology identifies a person's body language in order to determine how accessible he/she is to a robot during an interaction. Static human body poses are determined by first identifying individual body parts and then utilizing an indirect 3D human body model that is invariant to different body shapes and sizes. The authors implemented and tested their technique with two different sensory systems in social HRI scenarios to demonstrate its robustness for the proposed application. In particular, the experiments integrated the proposed body language recognition and affect classification methodology with imaging-based sensory systems on the human-like socially interactive robot Brian 2.0, enabling the robot to recognize affective body language during one-on-one interactions.
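The pipeline the abstract describes (identifying individual body parts, building a size-invariant representation of the static pose, and classifying the pose into an accessibility level) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the part names, the neck-to-torso normalization, the nearest-centroid classifier, and the two accessibility labels are all assumptions introduced here for clarity.

    import numpy as np

    # Hypothetical set of body parts a sensor might report as 3D positions.
    PARTS = ["head", "neck", "torso", "l_shoulder", "l_hand", "r_shoulder", "r_hand"]

    def descriptor(parts):
        """Express each part relative to the torso and scale by the
        neck-to-torso distance, so the resulting pose descriptor is
        invariant to body size and location (assumed normalization)."""
        origin = parts["torso"]
        scale = np.linalg.norm(parts["neck"] - origin)  # body-size proxy
        return np.concatenate(
            [(parts[p] - origin) / scale for p in PARTS if p != "torso"]
        )

    def classify_accessibility(desc, centroids):
        """Nearest-centroid classification over labelled pose descriptors;
        the centroids would be learned from training poses labelled with
        accessibility levels (assumption, not the authors' classifier)."""
        levels = list(centroids)
        dists = [np.linalg.norm(desc - centroids[k]) for k in levels]
        return levels[int(np.argmin(dists))]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Stand-in data: a real system would obtain part positions from an
        # imaging sensor and centroids from labelled training poses.
        pose = {p: rng.normal(size=3) for p in PARTS}
        centroids = {"accessible": rng.normal(size=18),
                     "not_accessible": rng.normal(size=18)}
        print(classify_accessibility(descriptor(pose), centroids))

In a deployed system the part positions would come from a depth or stereo imaging sensor, and the centroids from poses labelled by accessibility level during training.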