Affiliation:
1. Raghu Institute of Technology, Visakhapatnam, AP, India
Abstract
Human-Computer Interaction is the study of how humans and computers interact. Hand gestures are an effective way to communicate when spoken words alone are not understood, so recognizing them reliably is essential for the listener to grasp the speaker's intent. The main idea of this work is to explore different approaches to hand gesture recognition, first with radar data and then with a camera sensor. We initially built a hand gesture recognition system using radar data; because most people do not know sign language and interpreters are scarce, we then developed a real-time, neural-network-based approach to American Sign Language fingerspelling, followed by another model built on MediaPipe. We propose a neural network method to detect human hand gestures from camera-recorded images: each captured gesture image first passes through a filter, and the filtered image is then fed to a classifier that predicts the gesture class. Whereas the existing radar-based system cannot detect static gestures, our deep-learning-based approach combined with MediaPipe captures both static and dynamic gestures.
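As an illustrative sketch only (not the exact pipeline described above), the camera-based stage could use MediaPipe Hands to extract 21 hand landmarks per frame and pass the flattened coordinates to a gesture classifier; the classifier passed in as model here is a hypothetical placeholder standing in for a trained fingerspelling model.

    # Minimal sketch: MediaPipe Hands landmark extraction feeding a gesture classifier.
    # The classifier call (predict_gesture) is a hypothetical placeholder, not the paper's model.
    import cv2
    import mediapipe as mp
    import numpy as np

    mp_hands = mp.solutions.hands

    def extract_landmarks(image_bgr):
        """Return a flat (63,) array of x, y, z for 21 hand landmarks, or None if no hand is found."""
        with mp_hands.Hands(static_image_mode=True, max_num_hands=1,
                            min_detection_confidence=0.5) as hands:
            results = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
        if not results.multi_hand_landmarks:
            return None  # no hand detected in this frame
        lm = results.multi_hand_landmarks[0].landmark
        return np.array([[p.x, p.y, p.z] for p in lm], dtype=np.float32).ravel()

    def predict_gesture(features, model):
        """Feed landmark features to a trained classifier; 'model' is assumed, e.g. a scikit-learn estimator."""
        return model.predict(features.reshape(1, -1))[0]

In this sketch the landmark vector plays the role of the filtered representation, and any classifier exposing a predict method could serve as the gesture recognizer.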