Human Action Detection and Recognition: A Pragmatic Approach using Multiple Feature Extraction Techniques and Convolutional Neural Networks

Author:

Basavaiah Jagadeesh1, Anthony Audre Arlene1

Affiliation:

1. Vidyavardhaka College of Engineering

Abstract

Action recognition is the capability of determining the action that a human exhibits in a video. Recent innovations in both deep-learning and hand-crafted methods have substantially increased the accuracy of action recognition. However, many issues keep the action recognition task far from being solved. Human action recognition remains complicated and challenging due to the high complexity associated with human actions, such as motion pattern variation, appearance variation, viewpoint variation, occlusions, background variation and camera motion. This paper presents a computational approach for human action recognition on video datasets through different stages: detection, tracking of humans and recognition of actions. Human detection and tracking are carried out using a Gaussian Mixture Model (GMM) and Kalman filtering, respectively. Different feature extraction techniques, such as Scale Invariant Feature Transform (SIFT), optical flow estimation, Bi-dimensional Empirical Mode Decomposition (BEMD) and Discrete Wavelet Transform (DWT), are used to extract optimal features from the video frames. The features are fed to a Convolutional Neural Network classifier to recognize and classify the actions. Three datasets, viz. KTH, Weizmann and an own-created dataset, are used to evaluate the performance of the developed method. Because it combines the SIFT, BEMD and DWT feature extraction techniques, the proposed method is called the Hybrid Feature Extraction – Convolutional Neural Network based Video Action Recognition (HFE-CNN-VAR) method. The results demonstrate that the HFE-CNN-VAR method enhances the accuracy of action classification: 99.33% for the Weizmann dataset, 99.01% for the KTH dataset and 90% for the own dataset. Experimental results and comparative analysis show that the proposed approach surpasses other contemporary techniques.
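The detection-and-tracking front end described in the abstract can be sketched in a few lines. The sketch below is illustrative only, not the paper's implementation: it substitutes a single-Gaussian running background model for the full GMM, tracks the foreground centroid with a constant-velocity Kalman filter, and all names (`RunningGaussianBG`, `Kalman2D`, `track_demo`) and parameter values are assumptions chosen for the demo.

```python
import numpy as np

class RunningGaussianBG:
    """Per-pixel running mean/variance background model.

    A single-Gaussian simplification of the GMM background subtraction
    described in the abstract: pixels whose squared deviation from the
    running mean exceeds k^2 * variance are flagged as foreground.
    """
    def __init__(self, first_frame, alpha=0.02, k=2.5):
        self.mean = first_frame.astype(np.float64)
        self.var = np.full_like(self.mean, 15.0 ** 2)
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        frame = frame.astype(np.float64)
        d = frame - self.mean
        fg = d ** 2 > (self.k ** 2) * self.var      # boolean foreground mask
        # Slowly adapt the background toward the current frame.
        self.mean += self.alpha * d
        self.var += self.alpha * (d ** 2 - self.var)
        return fg

class Kalman2D:
    """Constant-velocity Kalman filter tracking a 2-D point (x, y)."""
    def __init__(self, x0, y0, dt=1.0, q=1e-2, r=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])       # state: [x, y, vx, vy]
        self.P = np.eye(4) * 10.0                   # state covariance
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.Q = np.eye(4) * q                      # process noise
        self.R = np.eye(2) * r                      # measurement noise

    def step(self, z):
        # Predict.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with measurement z = (x, y).
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

def track_demo(n_frames=12):
    """Track a synthetic bright square moving 1 px/frame to the right."""
    H, W = 32, 32
    bg = RunningGaussianBG(np.full((H, W), 10.0))   # clean background init
    kf, est = None, None
    for t in range(n_frames):
        frame = np.full((H, W), 10.0)
        x0 = 2 + t
        frame[14:18, x0:x0 + 4] = 200.0             # the moving "human"
        mask = bg.apply(frame)
        ys, xs = np.nonzero(mask)
        cx, cy = xs.mean(), ys.mean()               # foreground centroid
        if kf is None:
            kf = Kalman2D(cx, cy)
            est = np.array([cx, cy])
        else:
            est = kf.step((cx, cy))
    return est                                      # final (x, y) estimate

if __name__ == "__main__":
    print("estimated centroid:", track_demo())
```

In the full pipeline the tracked region would then be described with SIFT, optical flow, BEMD and DWT features and passed to the CNN classifier; those stages are omitted here.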

Publisher

Research Square Platform LLC

