Abstract
Amputation of the upper limb severely limits a patient's ability to perform activities of daily living. To address this challenge, this paper introduces a novel approach that combines two non-invasive modalities, electroencephalography (EEG) and electromyography (EMG), with machine learning techniques to recognize upper limb movements, with the objective of improving the control and functionality of prosthetic upper limbs through effective pattern recognition. In the proposed methodology, EMG and EEG signals are fused and processed with time-frequency domain feature extraction, enabling the classification of seven distinct hand and wrist movements. The Binary Grey Wolf Optimization (BGWO) algorithm is used to select the optimal feature subset for the proposed classification model. The results are promising: an average classification accuracy of 93.6% across three amputees and five individuals with intact limbs in discriminating the seven hand and wrist movements, validating the effectiveness of the proposed approach. By offering a non-invasive and reliable means of recognizing upper limb movements, this work represents a significant step forward in biotechnical engineering for upper limb amputees, with considerable potential to enhance the control and usability of prosthetic devices and, ultimately, the quality of life of individuals with upper limb amputations.
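The sketch below illustrates, under stated assumptions, the kind of pipeline the abstract describes: fusing EMG and EEG windows, extracting time-frequency (spectrogram band-power) features, selecting a feature subset with a binary grey wolf optimizer, and scoring a classifier on the seven movement classes. It is not the authors' implementation; the signal data are synthetic, and the channel counts, frequency bands, SVM classifier, and sigmoid-transfer BGWO variant are all assumptions made for illustration.

# Illustrative sketch (not the authors' code) of the described pipeline:
# EMG/EEG fusion -> time-frequency features -> BGWO feature selection ->
# classification of seven hand/wrist movements. All data are synthetic.
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

FS = 1000                            # assumed sampling rate (Hz)
N_TRIALS, N_EMG, N_EEG = 140, 4, 8   # hypothetical trial/channel counts
N_CLASSES = 7                        # seven hand and wrist movements

def tf_features(window, fs=FS):
    """Mean log band power from a spectrogram: one simple
    time-frequency feature per channel and frequency band."""
    feats = []
    for ch in window:                                # window: (channels, samples)
        f, _, Sxx = spectrogram(ch, fs=fs, nperseg=128)
        for lo, hi in [(4, 8), (8, 13), (13, 30), (30, 100)]:   # assumed bands
            feats.append(np.log(Sxx[(f >= lo) & (f < hi)].mean() + 1e-12))
    return np.asarray(feats)

# Synthetic trials standing in for recorded EMG + EEG windows.
X, y = [], []
for trial in range(N_TRIALS):
    label = trial % N_CLASSES
    emg = rng.standard_normal((N_EMG, FS)) * (1 + 0.1 * label)
    eeg = rng.standard_normal((N_EEG, FS))
    X.append(np.concatenate([tf_features(emg), tf_features(eeg)]))  # fusion
    y.append(label)
X, y = np.vstack(X), np.array(y)

def fitness(mask):
    """Cross-validated accuracy of an SVM (assumed classifier) on the
    selected feature subset, lightly penalised by subset size."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(SVC(kernel="rbf"), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.mean()

def bgwo(n_wolves=8, n_iter=10, dim=X.shape[1]):
    """Minimal binary grey wolf optimizer: continuous GWO position updates
    squashed through a sigmoid transfer function to produce bit masks."""
    wolves = rng.integers(0, 2, size=(n_wolves, dim))
    scores = np.array([fitness(w) for w in wolves])
    for it in range(n_iter):
        order = np.argsort(scores)[::-1]
        alpha, beta, delta = wolves[order[:3]]       # three best wolves (copies)
        a = 2 - 2 * it / n_iter                      # linearly decreasing a
        for i in range(n_wolves):
            step = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - wolves[i])
                step += leader - A * D
            prob = 1 / (1 + np.exp(-step / 3))       # sigmoid transfer
            wolves[i] = (rng.random(dim) < prob).astype(int)
            scores[i] = fitness(wolves[i])
    best = wolves[np.argmax(scores)]
    return best, scores.max()

mask, score = bgwo()
print(f"selected {mask.sum()}/{mask.size} features, CV accuracy ~ {score:.2f}")

On real recordings, the synthetic-trial section would be replaced by windowed, preprocessed EMG and EEG epochs, and the reported accuracy would come from held-out test data rather than the feature-selection fitness itself.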