Fast UOIS: Unseen Object Instance Segmentation with Adaptive Clustering for Industrial Robotic Grasping
Published: 2024-08-09
Issue: 8
Volume: 13
Page: 305
ISSN: 2076-0825
Container-title: Actuators
Language: en
Short-container-title: Actuators
Author:
Fu Kui 1,2, Dang Xuanju 1, Zhang Qingyu 1, Peng Jiansheng 2
Affiliation:
1. School of Electronic Engineering and Automation, Guilin University of Electronic Technology, Guilin 541004, China
2. School of Artificial Intelligence and Smart Manufacturing, Hechi University, Hechi 546300, China
Abstract
Segmenting unseen object instances in unstructured environments is an important skill for robots performing grasping-related tasks, where the trade-off between efficiency and accuracy remains an open challenge. In this work, we propose a fast unseen object instance segmentation (Fast UOIS) method that uses predicted per-pixel center offsets of objects to compute the positions of local maxima and minima, which serve as the initial seed points for the mean-shift clustering algorithm. By generating seed points adaptively, the clustering step obtains instance masks of unseen objects quickly and accurately. Concretely, Fast UOIS first produces pixel-wise predictions of object classes and center offsets from synthetic depth images. The clustering algorithm then uses these predictions to compute initial seed points and to recover candidate object instances. Finally, the depth information corresponding to the filtered instance masks is fed into a grasp generation network to produce grasp poses. Benchmark experiments show that our method transfers well to the real world and quickly generates sharp, accurate instance masks. We further demonstrate that our method can segment instance masks of unseen objects for robotic grasping.
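The pipeline sketched in the abstract lends itself to a compact illustration. Below is a minimal Python sketch of the seed-initialized mean-shift step, assuming the network outputs a binary foreground mask and per-pixel 2D center offsets; the function name, vote-map construction, window size, and thresholds are illustrative assumptions, not the authors' exact implementation (which also exploits local minima and additional mask filtering).

```python
# Illustrative sketch: adaptive seeds for mean-shift from predicted center offsets.
# Assumes fg is a (H, W) boolean foreground mask and offsets is a (H, W, 2)
# array of predicted (dy, dx) offsets toward each pixel's object center.
import numpy as np
from scipy.ndimage import maximum_filter
from sklearn.cluster import MeanShift

def segment_instances(fg, offsets, bandwidth=10.0, vote_thresh=5):
    H, W = fg.shape
    ys, xs = np.nonzero(fg)                        # foreground pixel coordinates
    # Each foreground pixel votes for its predicted object center.
    centers = np.stack([ys, xs], axis=1) + offsets[ys, xs]
    centers = np.clip(np.round(centers).astype(int), 0, [H - 1, W - 1])

    # Accumulate votes on the image grid; local maxima of the vote map
    # become the adaptive seed points for mean-shift.
    votes = np.zeros((H, W), dtype=np.int32)
    np.add.at(votes, (centers[:, 0], centers[:, 1]), 1)
    peaks = (votes == maximum_filter(votes, size=9)) & (votes >= vote_thresh)
    seeds = np.argwhere(peaks).astype(float)

    masks = np.zeros((H, W), dtype=np.int32)       # 0 = background
    if len(seeds) == 0 or len(ys) == 0:
        return masks

    # Mean-shift over the voted centers, initialized only at the seeds,
    # so each pixel is assigned to the instance mode its vote converges to.
    ms = MeanShift(bandwidth=bandwidth, seeds=seeds)
    labels = ms.fit_predict(centers.astype(float))
    masks[ys, xs] = labels + 1                     # per-pixel instance labels
    return masks
```

Because the seeds are derived from the offset predictions themselves rather than from exhaustive binned seeding, only a handful of mean-shift trajectories need to be run, which is where the speed advantage over conventionally initialized clustering comes from; the bandwidth and vote threshold above would in practice be tuned per dataset.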
Funder
National Natural Science Foundation of China; Science and Technology Plan Project of Guangxi; Research Project of Hechi University
References: 38 articles.
1. Learning hand-eye coordination for robotic grasping with large-scale data collection; Levine; Int. J. Robot. Res., 2018.
2. Robotic pick-and-place of novel objects in clutter with multi-affordance grasping and cross-domain image matching; Zeng; Int. J. Robot. Res., 2022.
3. Bicchi, A., and Kumar, V. (2000, April 24–28). Robotic grasping and contact: A review. Proceedings of the IEEE International Conference on Robotics and Automation, San Francisco, CA, USA.
4. Rubert, C., Kappler, D., Morales, A., Schaal, S., and Bohg, J. (2017, September 24–28). On the relevance of grasp metrics for predicting grasp success. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Vancouver, BC, Canada.
5. Jiang, Y., Moseson, S., and Saxena, A. (2011, May 9–13). Efficient grasping from RGBD images: Learning using a new rectangle representation. Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China.