Affiliation:
1. G. H. Raisoni University, Amravati, India
Abstract
The study of correctly identifying human actions under varying conditions is known as human activity recognition. Human activity is the continuous flow of one or more discrete actions required to accomplish a task. A sequence in which an individual enters a room, moves forward, sits down, and stands up is an example of human activity. Human activity recognition can be carried out at many levels of abstraction and has broad real-world applications, such as activity-based search, monitoring of critical sites, and patient monitoring. Over the years, researchers, engineers, and students from across the world have explored the identification of human actions. YOLOv4 and DarkNet are two computer vision frameworks used in machine learning-based activity recognition to identify the actions being performed.
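As a rough illustration of the per-frame detection step described in the abstract, the sketch below loads a Darknet-format YOLOv4 model through OpenCV's DNN module and runs it on a single video frame. This is a minimal sketch, not the paper's pipeline: the file names (yolov4.cfg, yolov4.weights, coco.names, frame.jpg) and the confidence and NMS thresholds are assumptions for illustration only.

```python
# Minimal sketch (assumed setup, not the paper's method): detect objects/persons
# in one frame with a Darknet-format YOLOv4 model via OpenCV's DNN module.
import cv2

# Assumed local files: Darknet config, pretrained weights, and class-name list.
net = cv2.dnn.readNetFromDarknet("yolov4.cfg", "yolov4.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

with open("coco.names") as f:
    class_names = [line.strip() for line in f]

frame = cv2.imread("frame.jpg")  # a single frame taken from the video stream

# Thresholds below are illustrative defaults, not values reported in the paper.
class_ids, scores, boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)

# Draw each detection so the frame can be inspected or fed to a later
# activity-recognition stage.
for class_id, score, box in zip(class_ids, scores, boxes):
    x, y, w, h = box
    label = f"{class_names[int(class_id)]}: {float(score):.2f}"
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(frame, label, (x, y - 5), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2)

cv2.imwrite("frame_detections.jpg", frame)
```

In practice this detection step would be applied frame by frame, with the resulting boxes and class labels passed to whatever temporal model interprets them as activities.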
References
1. J. Redmon, "YOLO: Real-Time Object Detection," pjreddie.com [Online]. Available: https://pjreddie.com/darknet/yolo/ [Accessed: 6 December 2020].
2. SecurityInfoWatch.com, "Data generated by new surveillance cameras to grow exponentially in the upcoming years," [Online]. [Accessed: 12 March 2018].
3. J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You Only Look Once: Unified, Real-Time Object Detection," 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, 2016, pp. 779-788. doi: 10.1109/CVPR.2016.91.
4. C. Wolf, J. Mille, E. Lombardi, O. Celiktutan, M. Jiu, E. Dogan, G. Eren, M. Baccouche, E. Dellandrea, C.-E. Bichot, C. Garcia, and B. Sankur, "Evaluation of video activity localizations integrating quality and quantity data," Computer Vision and Image Understanding, 127:14-30, 2014.
5. L. Wang, Y. Qiao, and X. Tang, "Action recognition with trajectory-pooled deep-convolutional descriptors," in CVPR, 2015, pp. 4305-4314.