Affiliation:
1. Computer and Information Department, Electronics Engineering College, Ninevah University, Mosul 41002, Iraq
2. Department of Computer Science, Cihan University Sulaimaniya, Sulaimaniya 46001, Iraq
Abstract
Much research attention is currently focused on human–computer interaction (HCI), particularly on biosignals, which have recently been used for remote control, offering benefits especially for disabled people and for protection against contagions such as coronavirus. In this paper, a biosignal type, namely the facial emotional signal, is proposed to control electronic devices remotely via emotional vision recognition. The objective is to convert only two facial expressions, smiling and non-smiling, captured by a camera, into a remote control signal. The methodology combines machine learning (for smile recognition) with embedded systems (for remote control over IoT). For smile recognition, the GENKI-4K database is used to train a model built in the following sequence of steps: real-time video, snapshot image, preprocessing, face detection, feature extraction using the histogram of oriented gradients (HOG), and finally classification with a support vector machine (SVM). The achieved recognition rate reaches 89% for training and testing with 10-fold cross-validation of the SVM. On the IoT side, Arduino and MCU (Tx and Rx) nodes transfer the resulting biosignal remotely as a server and client via the HTTP protocol. Promising experimental results are obtained in experiments on 40 participants who used their emotional biosignals to control several devices over Wi-Fi, such as opening and closing a door and turning an alarm on or off. The system implementing this research is developed in Matlab; it connects a webcam to an Arduino and an MCU node as an embedded system.
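The following is a minimal MATLAB sketch of the smile-recognition pipeline outlined in the abstract (preprocessing, face detection, HOG features, SVM with 10-fold cross-validation). It is illustrative only and not the authors' code; the folder layout of GENKI-4K, the 64x64 face size, and all variable names are assumptions, and it presumes the Computer Vision Toolbox and the Statistics and Machine Learning Toolbox are installed.

```matlab
% Sketch of the smile-recognition pipeline (assumptions: GENKI-4K images sorted
% into 'smile' / 'nonsmile' subfolders of a 'genki4k' directory; 64x64 face crops).
imds = imageDatastore('genki4k', 'IncludeSubfolders', true, 'LabelSource', 'foldernames');

detector = vision.CascadeObjectDetector();    % Viola-Jones face detector
features = [];
labels   = [];

for i = 1:numel(imds.Files)
    img  = readimage(imds, i);
    gray = im2gray(img);                      % preprocessing: grayscale
    bbox = step(detector, gray);              % face detection
    if isempty(bbox), continue; end
    face = imcrop(gray, bbox(1, :));          % keep the first detected face
    face = imresize(face, [64 64]);           % normalize size (assumed value)
    hog  = extractHOGFeatures(face);          % HOG feature vector
    features = [features; hog];               %#ok<AGROW>
    labels   = [labels;   imds.Labels(i)];    %#ok<AGROW>
end

% Binary SVM with 10-fold cross-validation, matching the evaluation
% protocol described in the abstract (reported accuracy up to 89%).
svmModel = fitcsvm(features, labels);
cvModel  = crossval(svmModel, 'KFold', 10);
fprintf('10-fold accuracy: %.2f%%\n', 100 * (1 - kfoldLoss(cvModel)));
```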
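For the remote-control side, the sketch below shows one plausible way the Matlab host could turn the recognized expression into an HTTP request to the receiving node, as the abstract describes (webcam snapshot, classification, command sent over Wi-Fi). The node's IP address, the endpoint path, the query parameter, and the reuse of `svmModel` from the previous sketch are all assumptions, not details taken from the paper.

```matlab
% Sketch of converting the smile / non-smile decision into an HTTP control signal.
% Assumes the trained svmModel from the previous sketch and the MATLAB Support
% Package for USB Webcams; the node URL and 'cmd' parameter are hypothetical.
cam      = webcam;
detector = vision.CascadeObjectDetector();
nodeUrl  = 'http://192.168.1.50/control';     % assumed address of the receiving node

while true
    img  = snapshot(cam);                     % snapshot image from real-time video
    gray = im2gray(img);
    bbox = step(detector, gray);
    if ~isempty(bbox)
        face  = imresize(imcrop(gray, bbox(1, :)), [64 64]);
        label = predict(svmModel, extractHOGFeatures(face));
        % Map the decision to a device command (e.g. door open/close, alarm on/off).
        if label == 'smile'
            webread(nodeUrl, 'cmd', 'on');    % HTTP GET with a query parameter
        else
            webread(nodeUrl, 'cmd', 'off');
        end
    end
    pause(0.5);                               % simple rate limiting
end
```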
Subject
Human-Computer Interaction
Cited by
1 article.