Author:
Chen Jianpeng, Luo Haiwei, Huang Sihan, Zhang Meidi, Wang Guoxin, Yan Yan, Jing Shikai
Abstract
Human-robot collaboration (HRC) plays an important role in human-centric manufacturing, which requires collaborative robots to have the ability to collaborate with humans autonomously. Understanding human intention during the assembly process is highly complex; therefore, in this paper we propose a method for autonomous HRC assembly driven by the fusion of a large language model (LLM) and a digital twin. The assembly state is recognized from two perspectives: the perception of key parts based on transfer learning and YOLO, and the recognition of operator actions based on LSTM and an attention mechanism. To improve the autonomy of HRC, a collaborative task decision method driven by an LLM fine-tuned on assembly domain knowledge is proposed. A case study of reducer assembly is presented to verify the effectiveness of the proposed method.