Authors:
Xiao Lei, Luo Kangrong, Liu Juntong, Foroughi Andia
Abstract
Smartphone sensors have gained considerable traction in Human Activity Recognition (HAR), drawing attention for their diverse applications. Accelerometer data monitoring holds promise for understanding students' physical activities and fostering healthier lifestyles. This technology tracks exercise routines, sedentary behavior, and overall fitness levels, potentially encouraging better habits, preempting health issues, and bolstering students' well-being. Traditionally, HAR analyzed signals linked to physical activities using handcrafted features. Recent years, however, have witnessed the integration of deep learning into HAR tasks, leveraging digital physiological signals from smartwatches and learning features automatically from raw sensory data. The Long Short-Term Memory (LSTM) network stands out as a potent algorithm for analyzing physiological signals, promising improved accuracy and scalability in automated signal analysis. In this article, we propose a feature analysis framework for recognizing student activity and monitoring health based on smartphone accelerometer data through an edge computing platform. Our objective is to boost HAR performance by accounting for the dynamic nature of human behavior. In a standard LSTM network, however, the number of hidden units and the initial learning rate are preset from prior knowledge, which can leave the model in a suboptimal state. To counter this, we employ a Bidirectional LSTM (BiLSTM), which processes each sequence in both the forward and backward directions, and use Bayesian optimization to fine-tune the BiLSTM architecture. Under fivefold cross-validation on the training and testing datasets, our model achieves a classification accuracy of 97.5% on the test set. Moreover, edge computing offers real-time processing, reduced latency, enhanced privacy, bandwidth efficiency, offline capabilities, energy efficiency, personalization, and scalability.
Extensive experimental results validate that our proposed approach surpasses state-of-the-art methodologies in recognizing human activities and monitoring health based on smartphone accelerometer data.
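The core of the approach described above is a BiLSTM, which runs one LSTM forward and another backward over each accelerometer window and concatenates the two hidden states per time step. A minimal NumPy sketch of that forward pass is given below; the gate ordering, hidden size, and the toy 3-axis/8-sample window are illustrative assumptions, not the paper's actual architecture or tuned hyperparameters.

```python
import numpy as np

def lstm_forward(x, W, U, b):
    """Run one LSTM over a sequence x of shape (T, d_in).
    W: (4*h, d_in), U: (4*h, h), b: (4*h,). Gate order: i, f, g, o."""
    T, _ = x.shape
    h_dim = U.shape[1]
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    outputs = []
    for t in range(T):
        z = W @ x[t] + U @ h + b          # all four gate pre-activations at once
        i = sigmoid(z[:h_dim])            # input gate
        f = sigmoid(z[h_dim:2 * h_dim])   # forget gate
        g = np.tanh(z[2 * h_dim:3 * h_dim])  # candidate cell state
        o = sigmoid(z[3 * h_dim:])        # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        outputs.append(h)
    return np.stack(outputs)              # (T, h_dim)

def bilstm_forward(x, params_fwd, params_bwd):
    """Bidirectional pass: forward LSTM on x, a second LSTM on the reversed
    sequence, outputs re-aligned and concatenated -> (T, 2*h_dim)."""
    h_fwd = lstm_forward(x, *params_fwd)
    h_bwd = lstm_forward(x[::-1], *params_bwd)[::-1]
    return np.concatenate([h_fwd, h_bwd], axis=1)

# Toy example: a 3-axis accelerometer window of 8 samples, hidden size 4
# (hypothetical sizes; real window length and units would come from tuning).
rng = np.random.default_rng(0)
T, d_in, h_dim = 8, 3, 4
make_params = lambda: (rng.normal(0, 0.1, (4 * h_dim, d_in)),
                       rng.normal(0, 0.1, (4 * h_dim, h_dim)),
                       np.zeros(4 * h_dim))
out = bilstm_forward(rng.normal(size=(T, d_in)), make_params(), make_params())
print(out.shape)  # (8, 8): T time steps, 2*h_dim concatenated features
```

In a full pipeline, the per-step features would feed a classifier head, and Bayesian optimization would search over choices such as `h_dim` and the learning rate; those stages are omitted here for brevity.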
Publisher
Springer Science and Business Media LLC