Author:
Chen Shang-Liang, Huang Li-Wu
Abstract
In this study, robot arm control, computer vision, and deep learning technologies are combined to realize an automatic control program. The program consists of three functional modules: a hand gesture recognition module, a robot arm control module, and a communication module. The hand gesture recognition module captures images of the user's hand gestures and recognizes their features using the YOLOv4 algorithm. The recognition results are transmitted to the robot arm control module by the communication module. Finally, the received hand gesture commands are analyzed and executed by the robot arm control module. With the proposed program, engineers can interact with the robot arm through hand gestures, teach the robot arm by recording trajectories with simple hand movements, and call different scripts to meet robot motion requirements in the actual production environment.
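The abstract describes a three-module pipeline (gesture recognition, communication, robot arm control). The following is a minimal illustrative sketch of such a pipeline, assuming a YOLOv4 model run through OpenCV's DNN API and a plain TCP link to the robot controller; the file names, gesture classes, host, and port are hypothetical and not taken from the paper.

```python
# Minimal sketch of the three-module pipeline described in the abstract:
# a YOLOv4-based gesture recognizer whose results are forwarded over a
# TCP socket to a robot-arm controller. Model files, gesture labels, and
# the controller address are illustrative assumptions.
import socket
import cv2
import numpy as np

# --- Hand gesture recognition module (YOLOv4 via OpenCV's DNN API) ---
net = cv2.dnn.readNetFromDarknet("yolov4-gestures.cfg", "yolov4-gestures.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)
GESTURES = ["record", "play", "stop"]  # hypothetical gesture classes

# --- Communication module (TCP link to the robot arm controller) ---
link = socket.create_connection(("robot-controller.local", 5000))  # assumed host/port

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    ids, confs, _boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
    for cid, conf in zip(np.ravel(ids), np.ravel(confs)):
        # Forward the recognized gesture as a simple text command; the
        # robot arm control module parses it and runs the matching script.
        link.sendall(f"{GESTURES[int(cid)]} {float(conf):.2f}\n".encode())

cap.release()
link.close()
```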
Publisher
Taiwan Association of Engineering and Technology Innovation
Subject
Electrical and Electronic Engineering, Mechanical Engineering, Mechanics of Materials, Civil and Structural Engineering
Cited by
7 articles.