Abstract
Objective.
Brain-computer interface (BCI) technology based on motor imagery (MI) control has become a research hotspot but continues to face numerous challenges. BCI can assist in the recovery of stroke patients and serve as a key technology in robot control. Current research on MI focuses almost exclusively on the hands, feet, and tongue. Therefore, the purpose of this paper is to establish a four-class MI BCI system in which the four classes correspond to four articulations of the right upper limb: the shoulder, elbow, wrist, and hand.
Approach.
Ten subjects were recruited to perform nine upper-limb analytic movements, after which the differences in P300, movement-related potentials (MRPs), and event-related desynchronization/event-related synchronization under voluntary MI (V-MI) and involuntary MI (INV-MI) were compared. Next, the cross-frequency coupling (CFC) coefficient based on mutual information was extracted from the electrodes and frequency bands of interest. Combined with the graph Fourier transform and a twin bounded support vector machine classifier, four kinds of electroencephalography data were classified, and the classifier's parameters were optimized using a genetic algorithm.
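To make the CFC step concrete, the sketch below shows one way a mutual-information-based coupling coefficient could be computed for a single EEG channel, here between a low-frequency phase and a high-frequency amplitude envelope. It is a minimal illustration, not the authors' exact pipeline: the sampling rate, band edges, filter order, and histogram bin count are assumptions chosen for the example.

```python
# Minimal sketch of a mutual-information-based CFC coefficient for one EEG channel.
# Assumed (illustrative) choices: 250 Hz sampling rate, 8-13 Hz phase band,
# 30-45 Hz amplitude band, 4th-order Butterworth filters, 16 histogram bins.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)


def mutual_information(x, y, bins=16):
    """Histogram-based mutual information (in nats) between two 1-D signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # skip empty bins to avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))


def cfc_mi(eeg, fs, phase_band=(8, 13), amp_band=(30, 45), bins=16):
    """CFC coefficient: MI between low-band phase and high-band amplitude envelope."""
    phase = np.angle(hilbert(bandpass(eeg, *phase_band, fs)))
    amp = np.abs(hilbert(bandpass(eeg, *amp_band, fs)))
    return mutual_information(phase, amp, bins=bins)


if __name__ == "__main__":
    fs = 250  # Hz, assumed sampling rate
    t = np.arange(0, 4, 1 / fs)
    # Synthetic test signal: gamma amplitude weakly modulated by alpha phase, plus noise.
    alpha = np.sin(2 * np.pi * 10 * t)
    gamma = (1 + 0.5 * alpha) * np.sin(2 * np.pi * 40 * t)
    eeg = alpha + gamma + 0.5 * np.random.randn(t.size)
    print("CFC (MI) coefficient:", cfc_mi(eeg, fs))
```

In a full pipeline, such a coefficient would be computed for each electrode and band pair of interest, and the resulting coupling features would then feed the graph Fourier transform and the twin bounded support vector machine classifier described above.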
Main results.
The results were encouraging, with average accuracies of 93.2% and 92.2% for V-MI and INV-MI, respectively, and over 95% for any three classes and any two classes. In most cases, feature extraction based on the proximal articulations achieved relatively high accuracy and better performance.
Significance.
This paper discussed four types of MI according to three aspects under two modes and classified them by combining the graph Fourier transform and CFC. Accordingly, the theoretical discussion and classification methods may provide a fundamental theoretical basis for BCI applications.
Funder
National Key R & D Program
Subject
Cellular and Molecular Neuroscience, Biomedical Engineering