Abstract
For decades, traditional computer interfaces such as keyboards and mice have been the primary means of interaction. These interfaces, however, can be restrictive, particularly in situations where hands-free or naturalistic interaction is desired. Wearable devices such as smartwatches and motion-capture sensors make gesture-based interaction possible, allowing people to communicate with computers through natural hand and body gestures. Gesture-based Human-Computer Interaction (HCI) is the technique of transmitting commands or input to a computer system through physical gestures such as hand movements, body movements, or facial expressions, rather than through standard input devices such as keyboards or touchpads. Gestures are a natural and intrinsic means for humans to communicate with one another. When gesture-based HCI is combined with wearable devices, people can interact with computers in a more intuitive and human-like manner. This natural interaction improves the user experience and shortens the learning curve for computer systems. Gesture-based HCI is an alternative interaction style that can considerably help those with physical disabilities or mobility issues. It allows for hands-free control, making technology more accessible to a wider variety of people, regardless of physical ability. Gesture-based interactions also have the potential to improve the efficiency of specific tasks, such as presentations, design work, and managing IoT devices. Because users can execute tasks quickly using simple gestures, this can lead to increased productivity and efficiency.