Abstract
Tracking objects that move along an unknown trajectory is a challenging task. Conventional model-based controllers require detailed knowledge of a robot's kinematics and of the target's trajectory, so tracking precision depends heavily on the kinematic model used to infer that trajectory. Control implementation in parallel robots is especially difficult due to their complex kinematics. Vision-based controllers are robust to uncertainties in a robot's kinematic model, since they can correct end-point trajectories as error estimates become available; robustness is guaranteed by taking the vision sensor's model into account when designing the control law. All camera space manipulation (CSM) models in the literature are position-based: they establish a mapping between the end effector position in Cartesian space and its position in sensor space. Such models are not appropriate for tracking moving targets because they relate the end effector to a fixed target point. The present work builds upon the literature by presenting a novel velocity-based CSM control that establishes a relationship between the trajectory of a moving target and the end effector position. Its efficacy is shown on a Delta-type parallel robot. Three types of experiments were performed: (a) static tracking (average error of 1.09 mm); (b) constant-speed linear trajectory tracking at 7, 9.5, and 12 cm/s (tracking errors of 8.89, 11.76, and 18.65 mm, respectively); and (c) freehand trajectory tracking (a maximum tracking error of 11.79 mm during motion and a maximum static positioning error of 1.44 mm once the object stopped). The resulting control cycle time was 48 ms. The results show a reduction in tracking errors for this robot with respect to previously published control strategies.
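To make the camera-space, velocity-based idea concrete, the following is a minimal, self-contained sketch in Python. It is not the CSM controller reported in the paper: it is a planar toy in which an end effector is driven at a Cartesian velocity computed from the pixel-space error between a moving target and the end effector. The affine camera model (CAMERA_SCALE, CAMERA_OFFSET), the gain K_P, and the target speed are assumed values chosen only for illustration; the 48 ms cycle time is taken from the abstract.

```python
"""Toy illustration of camera-space, velocity-based tracking (not the paper's CSM scheme)."""
import numpy as np

CYCLE_TIME = 0.048                        # 48 ms control cycle, as reported in the abstract
K_P = 4.0                                 # proportional gain on the camera-space error (assumed)
CAMERA_SCALE = 10.0                       # pixels per mm, toy affine camera model (assumed)
CAMERA_OFFSET = np.array([320.0, 240.0])  # principal point in pixels (assumed)

def to_camera_space(p_mm):
    """Map a planar Cartesian position (mm) to camera space (pixels) with a toy affine model."""
    return CAMERA_SCALE * p_mm + CAMERA_OFFSET

def simulate(duration_s=3.0, target_speed_mm_s=np.array([95.0, 0.0])):
    """Drive the end effector from camera-space error; target moves at ~9.5 cm/s."""
    effector = np.array([0.0, 50.0])      # end-effector position, mm
    target = np.array([0.0, 0.0])         # target position, mm
    for _ in range(int(duration_s / CYCLE_TIME)):
        target = target + target_speed_mm_s * CYCLE_TIME

        # Camera-space error between target and end effector (pixels).
        error_px = to_camera_space(target) - to_camera_space(effector)

        # Map the pixel error back to a Cartesian velocity command (mm/s).
        # With this toy camera model the "image Jacobian" is simply CAMERA_SCALE * I.
        v_cmd = K_P * error_px / CAMERA_SCALE

        effector = effector + v_cmd * CYCLE_TIME  # integrate the commanded velocity
    return np.linalg.norm(target - effector)      # final tracking error, mm

if __name__ == "__main__":
    print(f"final tracking error: {simulate():.2f} mm")
```

With purely proportional action on a moving target, this toy settles to a steady-state lag, which is the kind of error a velocity-based formulation for moving trajectories aims to reduce.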
Funder
Consejo Nacional de Ciencia y Tecnología
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
Cited by
3 articles.