Authors:
Baktash Abdullah Qassim, Mohammed Saleem Latteef, Jameel Huda Farooq
Abstract
Many people around the world lose the ability to speak and hear, to varying degrees, as a result of car or workplace accidents or disease. After losing the ability to communicate, these people cannot carry out the normal functions of everyday life, and the disability is often accompanied by psychological effects. This paper introduces a technique that realizes multiple sign language translation using a sensor-based glove and an Android smartphone, so that speech-impaired people can communicate normally with others. The hand talking system (HTS) was designed with the minimum possible number of sensors and a capable sewable controller (LilyPad). The proposed HTS comprises flex sensors, an accelerometer, the Arduino board, and a smartphone. An Android application stores multiple languages in an SQLite database and enables the user to interact with the system. The system lets the user speak either by forming words letter by letter or by mapping hand gestures to the words most frequently used in daily communication. The HTS achieved high accuracy for American Sign Language and Arabic Sign Language, about 98.26% and 99.33% respectively, with an average accuracy of 98.795% across both sign languages.
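To make the glove-to-phone pipeline concrete, the following is a minimal Arduino-style sketch, not the authors' code, illustrating one plausible reading of the architecture described in the abstract: flex sensor readings are quantized into a gesture code on the LilyPad and sent over the serial link to the Android application, which would then map the code to a stored word. The pin assignments, bend threshold, and serial framing are assumptions for illustration only.

```cpp
// Hypothetical sketch of the glove-side firmware: read five flex sensors,
// pack bent/straight states into a gesture code, and transmit it so the
// Android app can look the code up (e.g., in its SQLite word table).
#include <Arduino.h>

const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};  // one flex sensor per finger (assumed wiring)
const int BEND_THRESHOLD = 512;                 // assumed ADC level separating straight from bent

void setup() {
  Serial.begin(9600);  // UART link assumed to be bridged to the phone (e.g., via Bluetooth)
}

void loop() {
  int code = 0;
  for (int i = 0; i < 5; i++) {
    // Set bit i when finger i is bent past the threshold.
    if (analogRead(FLEX_PINS[i]) > BEND_THRESHOLD) {
      code |= (1 << i);
    }
  }
  Serial.println(code);  // the app maps this code to a letter or frequent word
  delay(100);            // sample at roughly 10 Hz
}
```

In such a design the accelerometer readings could be appended to the transmitted frame to distinguish gestures that share the same finger configuration, which is one way the reported ASL and ArSL vocabularies could both be supported from the same sensor set.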
Cited by: 5 articles.