Deep Learning Techniques for Radar-Based Continuous Human Activity Recognition

Authors:

Mehta Ruchita 1, Sharifzadeh Sara 2, Palade Vasile 1, Tan Bo 3, Daneshkhah Alireza 1, Karayaneva Yordanka 4

Affiliation:

1. Centre for Computational Science & Mathematical Modelling, Coventry University, Coventry CV1 5FB, UK

2. Department of Computer Science, Swansea University, Swansea SA1 8EN, UK

3. Faculty of Technology and Communication Sciences, Tampere University, 33100 Tampere, Finland

4. School of Computing, Engineering and Digital Technologies, Teesside University, Middlesbrough TS1 3BX, UK

Abstract

Human capability to perform routine tasks declines with age and age-related problems. Remote human activity recognition (HAR) is beneficial for the regular monitoring of the elderly population. This paper addresses the problem of the continuous detection of daily human activities using a mm-wave Doppler radar. Two strategies were employed: the first uses un-equalized series of activities, whereas the second uses a gradient-based strategy to equalize the series of activities. The dynamic time warping (DTW) algorithm and long short-term memory (LSTM) techniques were implemented for the classification of the un-equalized and equalized series of activities, respectively. The input for DTW was provided using three strategies. The first approach uses the pixel-level data of frames (UnSup-PLevel). In the other two strategies, a convolutional variational autoencoder (CVAE) is used to extract unsupervised encoded features (UnSup-EnLevel) and supervised encoded features (Sup-EnLevel) from the series of Doppler frames. The second approach, for the equalized data series, applies four distinct feature extraction methods: convolutional neural networks (CNNs), supervised and unsupervised CVAEs, and principal component analysis (PCA). The extracted features serve as the input to the LSTM. This paper presents a comparative analysis of a novel supervised feature extraction pipeline, employing Sup-EnLevel-DTW and Sup-EnLevel-LSTM, against several state-of-the-art unsupervised methods, including UnSup-EnLevel-DTW, UnSup-EnLevel-LSTM, CNN-LSTM and PCA-LSTM. The results demonstrate the superiority of the Sup-EnLevel-LSTM strategy. However, the UnSup-PLevel strategy worked surprisingly well without using annotations or frame equalization.
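The DTW classifier above compares un-equalized activity sequences of differing lengths. A minimal sketch of the classic DTW distance is shown below; scalar sequences are an assumption made here for brevity, whereas the paper applies DTW to pixel-level or CVAE-encoded Doppler frame sequences, where each element would be a feature vector.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences.

    Builds the cumulative-cost matrix and returns the cost of the best
    warping path aligning `a` to `b`.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]
```

Because the warping path can stretch or compress either sequence, identical activities performed at different speeds receive a small distance, which is why DTW suits the un-equalized series.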
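For the equalized series, one of the four feature extractors feeding the LSTM is PCA over flattened Doppler frames. A minimal sketch of that step follows; the function name, frame size, and component count are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def pca_features(frames, n_components=8):
    """Reduce a stack of Doppler frames (T, H, W) to (T, n_components).

    Flattens each frame to a pixel vector, centres the data, and projects
    onto the top principal directions obtained via SVD.
    """
    X = frames.reshape(len(frames), -1).astype(float)  # (T, H*W)
    X -= X.mean(axis=0)                                # centre each pixel
    # Rows of Vt are the principal directions, ordered by variance
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T                     # (T, n_components)
```

The resulting per-frame feature vectors form the fixed-length sequence that an LSTM classifier would consume, analogous to how the CNN and CVAE extractors are used in the paper.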

Funder

Coventry University

Publisher

MDPI AG

Subject

Artificial Intelligence, Engineering (miscellaneous)

Cited by 1 article.
