Abstract
This paper introduces the Human Action Multi-Modal Monitoring in Manufacturing (HA4M) dataset, a collection of multi-modal data on actions performed by different subjects while building an Epicyclic Gear Train (EGT). In particular, 41 subjects executed several trials of the assembly task, which consists of 12 actions. Data were collected in a laboratory setting using a Microsoft® Azure Kinect, which integrates a depth camera, an RGB camera, and infrared (IR) emitters. To the best of the authors' knowledge, the HA4M dataset is the first multi-modal dataset of an assembly task containing six types of data: RGB images, Depth maps, IR images, RGB-to-Depth-Aligned images, Point Clouds, and Skeleton data. These data provide a solid foundation for developing and testing advanced action recognition systems in fields such as Computer Vision and Machine Learning, and in application domains such as smart manufacturing and human-robot collaboration.
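As a minimal sketch of how the six modalities might be read for a single frame, the following Python snippet assumes a hypothetical per-subject, per-trial folder layout (rgb/, depth/, ir/, rgb_to_depth/, pointcloud/, skeleton/ with numbered frame files); the actual file organisation and formats should be taken from the HA4M release itself.

```python
# Hypothetical HA4M frame loader (not the official API): paths, file names and
# the JSON skeleton format are assumptions for illustration only.
from pathlib import Path
import json

import cv2            # pip install opencv-python
import open3d as o3d  # pip install open3d


def load_frame(trial_dir: str, frame_id: str = "000001"):
    """Load all six modalities for one frame, given the assumed layout."""
    root = Path(trial_dir)

    rgb = cv2.imread(str(root / "rgb" / f"{frame_id}.png"), cv2.IMREAD_COLOR)
    # Azure Kinect depth maps are typically 16-bit images in millimetres,
    # so they are read unchanged rather than converted to 8-bit.
    depth = cv2.imread(str(root / "depth" / f"{frame_id}.png"), cv2.IMREAD_UNCHANGED)
    ir = cv2.imread(str(root / "ir" / f"{frame_id}.png"), cv2.IMREAD_UNCHANGED)
    rgb_aligned = cv2.imread(str(root / "rgb_to_depth" / f"{frame_id}.png"), cv2.IMREAD_COLOR)

    # Point cloud stored as a standard PLY file (assumed).
    cloud = o3d.io.read_point_cloud(str(root / "pointcloud" / f"{frame_id}.ply"))

    # Skeleton data assumed to be per-frame JSON with 3D joint positions.
    with open(root / "skeleton" / f"{frame_id}.json") as f:
        skeleton = json.load(f)

    return rgb, depth, ir, rgb_aligned, cloud, skeleton


if __name__ == "__main__":
    frame = load_frame("HA4M/subject_01/trial_01")
    print([type(x).__name__ for x in frame])
```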
Publisher
Springer Science and Business Media LLC
Subject
Library and Information Sciences; Statistics, Probability and Uncertainty; Computer Science Applications; Education; Information Systems; Statistics and Probability
References
44 articles.
Cited by
16 articles.