Author:
Zhang Xiaodong, Li Hanzhe, Dong Runlin, Lu Zhufeng, Li Cunxin
Abstract
Fusion of the electroencephalogram (EEG) and the surface electromyogram (sEMG) is widely used to detect human movement intention for human–robot interaction, but the internal relationship between EEG and sEMG signals is not well understood, so existing fusion methods still have shortcomings. In this study, a precise EEG–sEMG fusion method based on a CNN-LSTM model was investigated to detect lower limb voluntary movement. First, the signal processing at each stage of EEG and sEMG was analyzed so that the response time difference between the two signals could be estimated for detecting lower limb voluntary movement; the estimated time difference was about 24–26 ms, while the value calculated by symbolic transfer entropy was between 25 and 45 ms. Second, both data-level and feature-level fusion of EEG and sEMG were used to construct the input data matrix, and a hybrid CNN-LSTM model was established as the EEG- and sEMG-based decoding model of lower limb voluntary movement. Finally, offline experiments showed that data fusion achieved significantly higher accuracy than feature fusion in 5-fold cross-validation, with the average accuracy of EEG–sEMG data fusion exceeding 95%; eliminating the response time difference between EEG and sEMG further improved the average accuracy of data fusion by about 0.7 ± 0.26%. Meanwhile, the online average accuracy of the data fusion-based CNN-LSTM exceeded 87% across all subjects. These results demonstrate that the time difference influences EEG–sEMG fusion for detecting lower limb voluntary movement and that the proposed CNN-LSTM model achieves high performance. This work provides a stable and reliable basis for human–robot interaction with lower limb exoskeletons.
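The abstract mentions symbolic transfer entropy as the tool for calculating the EEG-to-sEMG response time difference. The sketch below is a minimal, generic implementation of symbolic transfer entropy using ordinal (Bandt–Pompe) patterns, not the authors' code; the function names and the parameters `m` (embedding dimension), `tau` (embedding delay), and `delta` (prediction lag) are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def symbolize(x, m=3, tau=1):
    """Map a time series to ordinal-pattern symbols (Bandt-Pompe encoding)."""
    n = len(x) - (m - 1) * tau
    patterns = np.array([np.argsort(x[i:i + (m - 1) * tau + 1:tau])
                         for i in range(n)])
    # Encode each rank permutation as a single integer symbol.
    return patterns.dot(m ** np.arange(m))

def symbolic_transfer_entropy(src, dst, m=3, tau=1, delta=1):
    """STE from src to dst, in bits: how much src's symbol sequence
    improves prediction of dst's next symbol beyond dst's own past."""
    s, d = symbolize(src, m, tau), symbolize(dst, m, tau)
    n = min(len(s), len(d)) - delta
    trip = Counter(zip(d[delta:n + delta], d[:n], s[:n]))  # (d_{t+δ}, d_t, s_t)
    pair_ds = Counter(zip(d[:n], s[:n]))                   # (d_t, s_t)
    pair_dd = Counter(zip(d[delta:n + delta], d[:n]))      # (d_{t+δ}, d_t)
    single = Counter(d[:n])                                # d_t
    te = 0.0
    for (dn, dt, st), c in trip.items():
        # p(d_{t+δ}|d_t,s_t) / p(d_{t+δ}|d_t), expressed via raw counts
        te += (c / n) * np.log2((c * single[dt]) /
                                (pair_ds[(dt, st)] * pair_dd[(dn, dt)]))
    return te
```

For two coupled signals, the asymmetry STE(y→x) − STE(x→y) indicates the dominant direction of information flow, and scanning `delta` over a range of lags (e.g., candidate delays in milliseconds at the sampling rate) is one common way to locate a response time difference such as the 25–45 ms value reported in the abstract.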
Funder
National Natural Science Foundation of China
Key Research and Development Projects of Shaanxi Province
Fundamental Research Funds for the Central Universities
Cited by
5 articles.