Multimodal grasp data set: A novel visual–tactile data set for robotic manipulation

Authors:

Wang Tao (1, 2), Yang Chao (3), Kirchner Frank (2, 4), Du Peng (5), Sun Fuchun (3), Fang Bin (3)

Affiliations:

1. Intel Labs China, Beijing, China

2. Faculty of Mathematics and Computer Science, University of Bremen, Bremen, Germany

3. Department of Computer Science and Technology, Tsinghua University, Beijing, China

4. Robotics Innovation Center (DFKI RIC), German Research Center for Artificial Intelligence GmbH, Bremen, Germany

5. Machine Intelligence Institute, University of Electronic Science and Technology of China, Chengdu, China

Abstract

This article introduces a visual–tactile multimodal grasp data set, aiming to further research on robotic manipulation. The data set was built with a newly designed dexterous robot hand, Intel's Eagle Shoal robot hand (Intel Labs China, Beijing, China). It contains 2550 sets of data, including tactile readings, joint states, time labels, images, and RGB and depth video. By integrating visual and tactile data, researchers can better understand the grasping process and analyze deeper grasping issues. This article describes the construction process and the composition of the data set. To evaluate the quality of the data set, the tactile data were analyzed with the short-time Fourier transform. Tactile data–based slip detection was realized with long short-term memory networks and contrasted with visual data. The experiments compared long short-term memory with traditional classifiers, and the generalization ability across different grasp directions and different objects was evaluated. The results demonstrated effective slip detection and the generalization ability of long short-term memory, confirming the data set's value in promoting research in the robotic manipulation area. Further work on visual and tactile data will be carried out in the future.
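
The abstract mentions short-time Fourier transform analysis of the tactile signals and LSTM-based slip detection. The sketch below is not the authors' released code; it is a minimal illustration of those two steps under assumed shapes (sampling rate, taxel count, window size, and sequence length are all hypothetical placeholders).

```python
# Minimal sketch: STFT of a tactile channel plus an LSTM slip classifier.
# All sizes (sampling rate, taxel count, window length) are assumptions,
# not values from the data set.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

# --- STFT of one tactile channel (assumed 1 kHz sampling) ---
fs = 1000                                   # assumed tactile sampling rate [Hz]
tactile = np.random.randn(fs * 2)           # placeholder 2-second tactile signal
freqs, times, Z = stft(tactile, fs=fs, nperseg=128, noverlap=64)
spectrogram = np.abs(Z)                     # magnitude spectrogram as a feature map

# --- LSTM classifier over tactile sequences (hypothetical sizes) ---
class SlipLSTM(nn.Module):
    def __init__(self, n_features=16, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)   # slip vs. stable grasp

    def forward(self, x):                   # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])        # classify from the last time step

model = SlipLSTM()
batch = torch.randn(8, 100, 16)             # 8 sequences, 100 steps, 16 taxels
logits = model(batch)                       # (8, 2) slip/stable scores
```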

Funder

Intel Corporation

Publisher

SAGE Publications

Subject

Artificial Intelligence, Computer Science Applications, Software

Cited by 18 articles.

1. Visual and corresponding tactile dataset of flexible material for robots and cross modal perception;Data in Brief;2024-10

2. A Novel Visuo-Tactile Object Recognition Pipeline using Transformers with Feature Level Fusion;2024 International Joint Conference on Neural Networks (IJCNN);2024-06-30

3. Tactile-sensing-based robotic grasping stability analysis;Science China Technological Sciences;2024-05-29

4. Grasp Stability Prediction with Time Series Data Based on STFT and LSTM;2023 International Conference on Advanced Robotics and Mechatronics (ICARM);2023-07-08

5. Visuo-haptic object perception for robots: an overview;Autonomous Robots;2023-03-14
