Abstract
Precisely imitating human motion in real time is challenging for robots because of the differences in physical structure between humans and robots. This paper proposes a human–computer interaction method for remotely manipulating life-size humanoid robots, together with a new metric for evaluating motion similarity. First, we establish a motion capture system to acquire the operator’s motion data and retarget it to a standard bone model. Second, we develop a fast mapping algorithm that maps the BVH (BioVision Hierarchy) data collected by the motion capture system to the motion angle of each robot joint, realizing imitation-based motion control of the humanoid robot. Third, a DTW (Dynamic Time Warping)-based trajectory evaluation method is proposed to quantitatively evaluate the difference between the robot trajectory and the human motion; in addition, visualization terminals make it convenient to compare the two different but simultaneous motion systems. We design a complex gesture simulation experiment to verify the feasibility and real-time performance of the control method. The proposed human-in-the-loop imitation control method addresses the prominent non-isostructural retargeting problem between human and robot, enhances the robot’s interaction capability in a more natural way, and improves its adaptability to uncertain and dynamic environments.
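The abstract does not give the paper's exact DTW formulation or cost function, but the core idea of a DTW-based trajectory comparison can be illustrated with a textbook sketch. The function below aligns a human joint-angle trajectory with the corresponding robot trajectory and returns the accumulated alignment cost (lower means more similar); the absolute-difference cost and the single-joint, one-dimensional trajectories are illustrative assumptions, not the authors' metric.

```python
import numpy as np

def dtw_distance(human_traj, robot_traj):
    """Classic dynamic-programming DTW between two 1-D joint-angle
    trajectories (e.g. in radians). Returns the accumulated cost of
    the optimal time-warped alignment; 0 means a perfect match.
    Note: a hypothetical evaluation metric, not the paper's exact one."""
    n, m = len(human_traj), len(robot_traj)
    # D[i, j] = minimal cost of aligning the first i human samples
    # with the first j robot samples.
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(human_traj[i - 1] - robot_traj[j - 1])
            # Extend the cheapest of: insertion, deletion, or match.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Because DTW warps the time axis, a robot trajectory that reproduces the human motion with a variable lag still scores as similar, which is what makes it a reasonable quantitative measure for two "different but simultaneous" motion systems.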
Funder
Ministry of Science and Technology
Subject
Electrical and Electronic Engineering, Biochemistry, Instrumentation, Atomic and Molecular Physics, and Optics, Analytical Chemistry
Cited by
12 articles.