Device Position-Independent Human Activity Recognition with Wearable Sensors Using Deep Neural Networks
Published: 2024-03-03
Issue: 5
Volume: 14
Page: 2107
ISSN: 2076-3417
Container-title: Applied Sciences
Language: en
Short-container-title: Applied Sciences
Author:
Mekruksavanich, Sakorn (1) [ORCID]; Jitpattanakul, Anuchit (2,3) [ORCID]
Affiliation:
1. Department of Computer Engineering, School of Information and Communication Technology, University of Phayao, Phayao 56000, Thailand
2. Department of Mathematics, Faculty of Applied Science, King Mongkut’s University of Technology North Bangkok, Bangkok 10800, Thailand
3. Intelligent and Nonlinear Dynamic Innovations Research Center, Science and Technology Research Institute, King Mongkut’s University of Technology North Bangkok, Bangkok 10800, Thailand
Abstract
Human activity recognition (HAR) identifies people’s motions and actions in daily life. HAR research has grown with the popularity of internet-connected wearable sensors that capture human movement data to detect activities. Recent advances in deep learning have enabled more HAR research and applications using data from wearable devices. However, prior HAR research has often focused on a few fixed sensor locations on the body, and recognizing real-world activities is challenging when device positioning is uncontrolled or initial user training data are unavailable. This research analyzes the feasibility of deep learning models for both position-dependent and position-independent HAR. We introduce an advanced residual deep learning model called Att-ResBiGRU, which excels at accurate position-dependent HAR and also delivers excellent performance for position-independent HAR. We evaluate the model on three public HAR datasets, Opportunity, PAMAP2, and REALWORLD16, and compare it with previously published deep learning architectures for HAR. The proposed Att-ResBiGRU model outperforms existing techniques in accuracy, cross-entropy loss, and F1-score across all three datasets. Assessed with k-fold cross-validation, Att-ResBiGRU achieves F1-scores of 86.69%, 96.23%, and 96.44% on the PAMAP2, REALWORLD16, and Opportunity datasets, respectively, surpassing state-of-the-art models on all datasets. Our experiments and analysis demonstrate the exceptional performance of the Att-ResBiGRU model for HAR applications.
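The abstract does not spell out the layer configuration of Att-ResBiGRU, but the name indicates a residual, attention-augmented bidirectional GRU applied to windows of wearable sensor data. The PyTorch sketch below is an illustrative assumption of such an architecture; the convolutional front end, layer sizes, additive attention pooling, and class/channel counts are hypothetical choices, not the authors' published design.

# Hypothetical sketch of an attention-augmented residual BiGRU classifier
# for sensor-based HAR. All layer sizes and structural details are
# illustrative assumptions, not the configuration reported in the paper.
import torch
import torch.nn as nn

class AttResBiGRU(nn.Module):
    def __init__(self, n_channels=9, n_classes=12, hidden=128):
        super().__init__()
        # Convolutional front end extracts local features from raw sensor windows.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, hidden, kernel_size=5, padding=2),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
        )
        # Bidirectional GRU models temporal dependencies in both directions.
        self.bigru = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        # Project BiGRU output back to the conv width so a residual (skip)
        # connection can be added around the recurrent block.
        self.proj = nn.Linear(2 * hidden, hidden)
        # Simple additive attention pools the sequence into a single vector.
        self.attn = nn.Linear(hidden, 1)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                       # x: (batch, channels, time)
        h = self.conv(x).transpose(1, 2)        # (batch, time, hidden)
        g, _ = self.bigru(h)                    # (batch, time, 2*hidden)
        g = self.proj(g) + h                    # residual connection
        w = torch.softmax(self.attn(g), dim=1)  # attention weights over time
        z = (w * g).sum(dim=1)                  # weighted temporal pooling
        return self.fc(z)                       # class logits

# Example: a batch of 16 windows, 9 sensor channels, 128 time steps.
model = AttResBiGRU(n_channels=9, n_classes=12)
logits = model(torch.randn(16, 9, 128))
print(logits.shape)  # torch.Size([16, 12])

In a sketch like this, the residual connection stabilizes training of the recurrent block, while the attention pooling weights the time steps most informative for the activity class, which is one plausible reading of how the "Att-Res" components in the model name fit together.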
Funder
Thailand Science Research and Innovation Fund and University of Phayao