Author:
Jiang Ping, Oaki Junji, Ishihara Yoshiyuki, Ooga Junichiro, Han Haifeng, Sugahara Atsushi, Tokura Seiji, Eto Haruna, Komoda Kazuma, Ogawa Akihito
Abstract
Deep learning has been widely used to infer robust grasps. Human-labeled RGB-D datasets were initially used to learn grasp configurations, but preparing such large datasets is expensive. To address this problem, images were generated by a physical simulator, and a physically inspired model (e.g., a contact model between a suction vacuum cup and the object) was used as a grasp quality evaluation metric to annotate the synthesized images. However, such contact models are complicated and require experimental parameter identification to ensure real-world performance. In addition, previous studies have not considered manipulator reachability, such as cases where a grasp configuration with high grasp quality cannot be reached due to collisions or the physical limits of the robot. In this study, we propose an intuitive, geometric analytic-based grasp quality evaluation metric and further incorporate a reachability evaluation metric. We annotate pixel-wise grasp quality and reachability on simulator-synthesized images using the proposed metrics to train an auto-encoder–decoder called suction graspability U-Net++ (SG-U-Net++). Experimental results show that our intuitive grasp quality metric is competitive with a physically inspired metric. Learning reachability helps reduce motion planning computation time by removing obviously unreachable candidates. The system achieves an overall picking speed of 560 PPH (pieces per hour).
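The abstract contrasts a complicated contact model with an intuitive geometric metric. As a rough illustration of the geometric idea (the paper's actual metric is not given here, so the scoring function below is a hypothetical stand-in), one can fit a plane to a local depth patch around a suction candidate and combine normal alignment with the approach direction and surface flatness:

```python
import numpy as np

def geometric_suction_score(depth_patch, approach=np.array([0.0, 0.0, 1.0])):
    """Toy geometric suction-graspability score (illustrative only).

    Combines (a) alignment of the locally fitted surface normal with the
    approach direction and (b) flatness, measured by the plane-fit
    residual. This is a hypothetical sketch, not the paper's metric.
    """
    h, w = depth_patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Least-squares plane fit: z = a*x + b*y + c
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, res, _, _ = np.linalg.lstsq(A, depth_patch.ravel(), rcond=None)
    a, b, _ = coeffs
    # Upward-pointing normal of the fitted plane
    normal = np.array([-a, -b, 1.0])
    normal /= np.linalg.norm(normal)
    alignment = float(np.clip(np.dot(normal, approach), 0.0, 1.0))
    rms = float(np.sqrt(res[0] / (h * w))) if res.size else 0.0
    flatness = 1.0 / (1.0 + rms)  # 1.0 for a perfectly flat patch
    return alignment * flatness

# A flat, horizontal patch scores near 1.0; a tilted one scores lower.
flat = np.zeros((8, 8))
print(geometric_suction_score(flat))
```

Evaluating such a score per pixel yields the kind of dense graspability map that, per the abstract, is used to annotate synthesized images for training SG-U-Net++.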
Subject
Artificial Intelligence, Biomedical Engineering
Cited by: 10 articles.