Affiliation:
1. Mohan Babu University, India
2. Sai Ram Engineering College, Chennai, India
Abstract
Day-to-day life has become smarter and more intertwined with technology in the modern era. We can chat with AI chatbots and play against AI machines, which in many cases defeat human players, and these games attract players of all age groups. The computer gaming industry has found artificial intelligence to be a necessary element in making games more entertaining and challenging. This work integrates the rock-paper-scissors (RPS) game with artificial intelligence (AI) using OpenCV and MediaPipe. Through collaboration with designers and animators, the framework is designed to be computationally light while remaining entertaining and visually appealing. To evaluate kinematic hand data in Python, the underlying gesture recognition pipeline employs a Leap Motion device and two distinct machine learning architectures. The proposed system provides a powerful platform for future research into social human-machine interaction.
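The abstract does not include implementation details, but the OpenCV-plus-MediaPipe approach it describes can be illustrated with a minimal sketch. The snippet below is an assumption-based example, not the authors' pipeline: it uses MediaPipe's hand-landmark model and a simple finger-counting heuristic (landmark indices and thresholds chosen for illustration) to label a webcam frame as rock, paper, or scissors.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# MediaPipe hand-landmark indices (illustrative heuristic, not the paper's method)
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def classify_rps(landmarks):
    """Label a hand pose as rock, paper, or scissors by counting extended fingers."""
    extended = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        # A finger counts as extended if its tip lies above its PIP joint
        # (image y-coordinates increase downward).
        if landmarks[tip].y < landmarks[pip].y:
            extended += 1
    if extended == 0:
        return "rock"
    if extended == 2:
        return "scissors"
    if extended >= 4:
        return "paper"
    return "unknown"

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            gesture = classify_rps(results.multi_hand_landmarks[0].landmark)
            cv2.putText(frame, gesture, (30, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
        cv2.imshow("RPS", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

In practice, a game loop would compare the detected gesture against the computer's randomly chosen move each round; the rule-based classifier here stands in for the learned models mentioned in the abstract, which are not specified in this excerpt.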