DEEP LEARNING BASED HUMAN ROBOT INTERACTION WITH 5G COMMUNICATION
Authors:
BARSTUĞAN Mücahid1, OSMANPAŞAOĞLU Zeynep2
Affiliation:
1. Konya Teknik Üniversitesi 2. Marmara University, Faculty of Technology
Abstract
Factories focusing on digital transformation accelerate their production and surpass their competitors by increasing controllability and efficiency. In this study, data obtained through image processing was transferred to a collaborative robot arm over 5G communication, and the robot arm was controlled remotely. A 3D-printed humanoid hand was mounted on the end of the robot arm for bin picking, with each of its five fingers driven by a servo motor. For finger control, the user wore a glove; a flex sensor attached to each finger of the glove transferred the user's finger positions to the corresponding servo motors, enabling the desired pick-and-place operation. Position control of the robot arm was realized with image processing: the glove worn by the user was detected by two different YOLO (You Only Look Once) methods. The YOLOv4 and YOLOv5 algorithms, implemented in Python, were compared for object detection. In the test phase, the highest detection accuracy on the front camera was 99.75% with YOLOv4 and 99.83% with YOLOv5; on the side camera, YOLOv4 reached 97.59% and YOLOv5 reached 97.90%.
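The glove-to-hand control described above amounts to mapping each flex sensor's reading onto a servo angle for the matching finger. A minimal sketch of that per-finger mapping is given below; the ADC range (200–800) and servo angle range (0–180°) are illustrative assumptions, not values from the paper.

```python
def flex_to_servo(raw, raw_min=200, raw_max=800, angle_min=0, angle_max=180):
    """Clamp a flex-sensor ADC reading and map it linearly to a servo angle.

    The sensor and angle ranges are hypothetical; real values depend on the
    flex sensors and servos used.
    """
    raw = max(raw_min, min(raw_max, raw))
    return angle_min + (raw - raw_min) * (angle_max - angle_min) / (raw_max - raw_min)

# One reading per finger of the glove -> one angle command per servo motor
readings = [200, 500, 800, 350, 650]
angles = [round(flex_to_servo(r)) for r in readings]
print(angles)  # [0, 90, 180, 45, 135]
```

In the actual system these angle commands would be sent to the servo controller on the humanoid hand over the 5G link rather than printed locally.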
Funder
Mitsubishi Electric Türkiye
Publisher
Konya Muhendislik Bilimleri Dergisi
Cited by
1 article.
1. Classification of Traffic Signs Using Transfer Learning Methods;Afyon Kocatepe University Journal of Sciences and Engineering;2024-07-23