FabricFolding: learning efficient fabric folding without expert demonstrations
Authors:
He Can, Meng Lingxiao, Sun Zhirui, Wang Jiankun, Meng Max Q.-H.
Abstract
Autonomous fabric manipulation is a challenging task due to complex dynamics and potential self-occlusion during fabric handling. An intuitive approach to fabric-folding manipulation is to first obtain a smooth and unfolded fabric configuration before the folding process begins. However, the combination of quasi-static actions such as pick & place and dynamic actions such as fling proves inadequate for effectively unfolding long-sleeved T-shirts whose sleeves are mostly tucked inside the garment. To address this limitation, this paper introduces an enhanced quasi-static action called pick & drag, specifically designed to handle this type of fabric configuration. Additionally, this paper presents an efficient dual-arm manipulation system that combines quasi-static actions (pick & place and pick & drag) with the dynamic fling action to flexibly manipulate fabrics into unfolded and smooth configurations. Once the fabric is confirmed to be sufficiently unfolded and all fabric keypoints are detected, a keypoint-based heuristic folding algorithm is employed for the fabric-folding process. To address the scarcity of publicly available keypoint detection datasets for real fabric, we collected images of various fabric configurations and types in real scenes to create a comprehensive keypoint dataset for fabric folding, aimed at improving the success rate of keypoint detection. Moreover, we evaluate the effectiveness of the proposed system in real-world settings, where it consistently and reliably unfolds and folds various types of fabrics, including challenging cases such as long-sleeved T-shirts with most of the sleeves tucked inside the garment. Specifically, our method achieves a coverage rate of 0.822 and a success rate of 0.88 for long-sleeved T-shirt folding. Supplemental materials and the dataset are available on our project webpage at https://sites.google.com/view/fabricfolding.
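The pipeline described in the abstract (alternate unfolding primitives until the fabric is smooth and all keypoints are visible, then fold) can be sketched as a simple action selector. This is a minimal illustrative sketch only; the function name, thresholds, and branching order are assumptions, not the authors' actual implementation.

```python
COVERAGE_THRESHOLD = 0.8  # assumed threshold for "sufficiently unfolded"

def choose_action(coverage, keypoints_detected, sleeve_tucked):
    """Pick the next manipulation primitive, per the pipeline in the abstract.

    coverage           -- fraction of the fabric's maximal area currently covered
    keypoints_detected -- True if all fabric keypoints (e.g. collar, sleeve tips)
                          were found by the keypoint detector
    sleeve_tucked      -- True if a sleeve is mostly tucked inside the garment
    """
    if coverage >= COVERAGE_THRESHOLD and keypoints_detected:
        return "heuristic_fold"   # keypoint-based heuristic folding begins
    if sleeve_tucked:
        return "pick_and_drag"    # enhanced quasi-static action (this paper)
    if coverage < 0.5:            # assumed cutoff for heavily crumpled states
        return "fling"            # dynamic action to unfold large areas
    return "pick_and_place"       # quasi-static smoothing of residual wrinkles

print(choose_action(0.85, True, False))   # heuristic_fold
print(choose_action(0.60, False, True))   # pick_and_drag
print(choose_action(0.30, False, False))  # fling
```

The ordering encodes the paper's key observation: tucked-in sleeves are not fixed by fling or pick & place, so pick & drag takes priority whenever that configuration is detected.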
Publisher
Cambridge University Press (CUP)
Cited by
1 article.
1. POE: Acoustic Soft Robotic Proprioception for Omnidirectional End-effectors;2024 IEEE International Conference on Robotics and Automation (ICRA);2024-05-13