Abstract
Human-machine interfaces contribute to improving the quality of life of physically disabled users. In this study, a non-invasive brain-machine interface (BMI) design methodology was proposed to control a robot arm through magnetoencephalography (MEG), based on directionally modulated MEG activity acquired while the user imagined wrist movements in four different directions. The partial directed coherence (PDC) measure, derived from the functional connectivity between cortical brain regions, was utilized for feature extraction. The time-varying parameters were estimated with a time-varying multivariate adaptive autoregressive (AAR) model, which can capture task-dependent features and non-symmetric channel relevance for mental task discrimination. An extreme learning machine (ELM), which uses the Moore-Penrose (MP) generalized inverse to set its output weights and therefore does not require a gradient-based backpropagation algorithm, was employed to build a model from the extracted feature set. The output of the task classification model was fed into the robotic arm model to realize control tasks. The classification results indicate that the proposed BMI methodology is a feasible solution for rehabilitation or assistive systems designed to help motor-impaired people. The proposed methodology provides very satisfactory classification performance at a fast learning speed.
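The ELM training step described above can be sketched in a few lines: hidden-layer input weights are drawn at random and never updated, and the output weights are obtained in closed form via the Moore-Penrose pseudoinverse instead of backpropagation. This is a minimal illustration, not the authors' implementation; the toy feature vectors and class labels below are hypothetical stand-ins for the PDC feature set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for PDC feature vectors: two well-separated
# classes, 40 samples, 4 features, one-hot target matrix T.
X = np.vstack([rng.normal(-2.0, 0.5, (20, 4)),
               rng.normal(2.0, 0.5, (20, 4))])
T = np.zeros((40, 2))
T[:20, 0] = 1.0
T[20:, 1] = 1.0

# ELM: random fixed input weights and biases, nonlinear hidden layer.
n_hidden = 20
W = rng.normal(size=(4, n_hidden))   # input weights, never trained
b = rng.normal(size=n_hidden)        # hidden biases, never trained
H = np.tanh(X @ W + b)               # hidden-layer activation matrix

# Output weights via the Moore-Penrose generalized inverse: the
# least-squares solution of H @ beta = T, with no iterative training.
beta = np.linalg.pinv(H) @ T

# Predicted class = argmax over the output nodes.
pred = np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
acc = float(np.mean(pred == np.argmax(T, axis=1)))
```

Because `beta` is computed in one pseudoinverse step, training is essentially a single linear solve, which is the source of the fast learning speed claimed for ELM.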
Publisher
Cold Spring Harbor Laboratory