Author:
Yang Lei, Zhang Fuhai, Zhu Jingbin, Fu Yili
Abstract
Purpose
The accuracy and reliability of upper limb motion assessment have received great attention in the field of rehabilitation. The grasping test, in which patients grasp objects and move them to a target place, is widely used for motion assessment. Traditional assessments of upper limb motion ability are performed by therapists, rely mainly on experience and lack quantitative indicators. This paper aims to propose a deep learning method, based on the vision system of our upper limb rehabilitation robot, that automatically recognizes the motion trajectory of rehabilitation target objects and quantitatively assesses upper limb motion in the grasping test.
Design/methodology/approach
To begin with, an SRF network is designed to recognize the rehabilitation target objects grasped in assessment tests. The upper limb motion trajectory is then calculated from the motion of the objects' central positions. After that, a GAE network is designed to analyze the motion trajectory, which reflects the motion of the upper limb. Finally, upper limb motion assessment tests are carried out on the upper limb rehabilitation exoskeleton platform to show the accuracy of both the object recognition of the SRF network and the motion assessment of the GAE network. The results, including object recognition, trajectory calculation and deviation assessment, are given in detail.
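As a minimal illustrative sketch (not the authors' implementation, whose SRF and GAE network details are not given here), the trajectory-calculation step can be pictured as converting per-frame bounding boxes of the detected object into a sequence of center points, and the deviation assessment as measuring how far that path strays from the straight line between start and target positions. The function names and the box format are assumptions for illustration:

```python
import numpy as np

def trajectory_from_boxes(boxes):
    """Convert per-frame bounding boxes (x1, y1, x2, y2) of the grasped
    object into a trajectory of center points, shape (n_frames, 2)."""
    boxes = np.asarray(boxes, dtype=float)
    cx = (boxes[:, 0] + boxes[:, 2]) / 2.0
    cy = (boxes[:, 1] + boxes[:, 3]) / 2.0
    return np.stack([cx, cy], axis=1)

def mean_deviation(trajectory, start, goal):
    """Mean perpendicular distance of trajectory points from the straight
    line joining start and goal -- one simple deviation index."""
    start = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    d = goal - start
    d /= np.linalg.norm(d)          # unit vector along the ideal path
    rel = trajectory - start
    # perpendicular component: rel minus its projection onto d
    perp = rel - np.outer(rel @ d, d)
    return float(np.linalg.norm(perp, axis=1).mean())
```

For example, three detections whose centers lie at (1, 1), (5, 3) and (9, 1) yield a mean deviation of 2/3 from the straight path between (1, 1) and (9, 1). The published method additionally grades such trajectories with the learned GAE network rather than a fixed geometric index.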
Findings
The performance of the proposed networks is validated by experiments conducted on the upper limb rehabilitation robot, which involve recognizing rehabilitation target objects, calculating the motion trajectory and grading upper limb motion performance. The results illustrate that the networks, covering both object recognition and trajectory evaluation, can grade upper limb motion function accurately, with an accuracy above 95.0% in different grasping tests.
Originality/value
A novel assessment method for upper limb motion is proposed and verified. The experimental results show that the accuracy is remarkably enhanced and the stability of the results is improved, providing more quantitative indicators for further application in upper limb motion assessment.