Abstract
Human–computer interface (HCI) methods based on electrooculogram (EOG) signals generated by eye movement have been studied continuously because they can transmit commands to a computer or machine without the use of either arm. However, usability and appearance are major obstacles to practical application, since conventional EOG-based HCI methods require skin electrodes placed around the eyes near the lateral and medial canthi. To address these problems, in this paper we report the development of an HCI method that simultaneously acquires EOG and surface-electromyogram (sEMG) signals through electrodes integrated into bone conduction headphones and transmits commands via horizontal eye movements and various biting movements. The developed system classifies eye position by dividing the 80-degree range (from −40 degrees to the left to +40 degrees to the right) into 20-degree sections and also recognizes three biting movements from the bio-signals obtained from the three electrodes, so that a total of 11 commands can be delivered to a computer or machine. The experimental results showed that the interface achieves accuracies of 92.04% and 96.10% for EOG signal-based and sEMG signal-based commands, respectively. In a virtual keyboard application, the accuracy was 97.19%, the precision was 90.51%, and the typing speed was 5.75–18.97 letters/min. The proposed interface system can be applied to various HCI and HMI fields as well as virtual keyboard applications.
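The command scheme described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the section boundaries follow the 20-degree division stated in the abstract, but the bite-gesture names and command labels are hypothetical placeholders.

```python
# Illustrative sketch of the abstract's command mapping (not the paper's code).
# Gaze angles in the -40..+40 degree range are binned into 20-degree sections;
# bite gestures map to separate commands. Gesture and command names below are
# hypothetical.

def gaze_section(angle_deg: float) -> int:
    """Return the index (0-3) of the 20-degree section covering the
    -40..+40 degree range; angles outside the range are clamped."""
    clamped = max(-40.0, min(40.0, angle_deg))
    # Sections: [-40,-20) -> 0, [-20,0) -> 1, [0,20) -> 2, [20,40] -> 3
    return min(3, int((clamped + 40.0) // 20.0))

# Hypothetical mapping from the three recognized biting movements to commands.
BITE_COMMANDS = {"left_bite": "select", "right_bite": "back", "double_bite": "enter"}

def to_command(angle_deg: float = None, bite: str = None) -> str:
    """Map either an EOG gaze angle or an sEMG bite gesture to a command label."""
    if bite is not None:
        return BITE_COMMANDS[bite]
    return f"move_to_section_{gaze_section(angle_deg)}"
```

A real system would of course derive the gaze angle and bite classification from the filtered EOG/sEMG signals; this sketch only shows the final angle-to-command binning step.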
Funder
Catholic University of Korea
National Research Foundation of Korea
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering
Cited by
1 article.