Enhancing Human Activity Recognition in Smart Homes with Self-Supervised Learning and Self-Attention

Authors:

Hui Chen 1, Charles Gouin-Vallerand 1, Kévin Bouchard 2, Sébastien Gaboury 2, Mélanie Couture 3, Nathalie Bier 4, Sylvain Giroux 1

Affiliation:

1. Department of Computer Science, Université de Sherbrooke, 2500 Bd de l’Université, Sherbrooke, QC J1K 2R1, Canada

2. Department of Computer Science and Mathematics, Université du Québec à Chicoutimi, 555 Bd de l’Université, Chicoutimi, QC G7H 2B1, Canada

3. Faculty of Arts and Humanities, Université de Sherbrooke, 2500 Bd de l’Université, Sherbrooke, QC J1K 2R1, Canada

4. School of Rehabilitation, Faculty of Medicine, Université de Montréal, 2900 Bd Édouard-Montpetit, Montréal, QC H3T 1J4, Canada

Abstract

Deep learning models have gained prominence in human activity recognition using ambient sensors, particularly for telemonitoring older adults’ daily activities in real-world scenarios. However, collecting large volumes of annotated sensor data is a formidable challenge, given the time-consuming and costly nature of manual annotation, especially for extensive projects. In response, we propose AttCLHAR, a novel model rooted in the self-supervised learning framework SimCLR and augmented with a self-attention mechanism. The model targets human activity recognition from ambient sensor data and is explicitly tailored to scenarios with limited or no annotations. AttCLHAR comprises an unsupervised pre-training phase and a fine-tuning phase that share a common encoder with two convolutional layers and a long short-term memory (LSTM) layer, whose output feeds a self-attention layer that lets the model selectively focus on different segments of the input sequence. Sharpness-aware minimization (SAM) is incorporated to improve generalization by penalizing loss sharpness. The pre-training phase learns representative features from abundant unlabeled data, capturing both spatial and temporal dependencies in the sensor data and yielding informative features for subsequent fine-tuning. We extensively evaluated AttCLHAR on three CASAS smart home datasets (Aruba-1, Aruba-2, and Milan) and compared its performance against the SimCLR framework, SimCLR with SAM, and SimCLR with the self-attention layer. The experimental results demonstrate the superior performance of our approach, especially in semi-supervised and transfer learning scenarios. It outperforms existing models, marking a significant advance in using self-supervised learning to extract valuable insights from unlabeled ambient sensor data in real-world environments.
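The architecture described in the abstract (a shared encoder with two convolutional layers, an LSTM, and a self-attention layer, pre-trained with a SimCLR-style contrastive objective) can be illustrated with the minimal sketch below. It assumes PyTorch and illustrative values for the sensor-channel count, hidden size, projection dimension, and temperature, none of which are specified here; it is not the authors' implementation, and the SAM optimizer wrapper and the fine-tuning classifier head are omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttCLHARStyleEncoder(nn.Module):
    # Sketch of an AttCLHAR-style encoder: Conv -> Conv -> LSTM -> self-attention,
    # plus a SimCLR projection head used only during contrastive pre-training.
    # Layer sizes are illustrative assumptions, not values from the paper.
    def __init__(self, n_sensors=32, hidden=128, proj_dim=64):
        super().__init__()
        self.conv = nn.Sequential(                        # local (spatial) patterns
            nn.Conv1d(n_sensors, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.lstm = nn.LSTM(128, hidden, batch_first=True)  # temporal dependencies
        self.attn = nn.MultiheadAttention(hidden, num_heads=1, batch_first=True)
        self.proj = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, proj_dim))

    def forward(self, x):                     # x: (batch, seq_len, n_sensors)
        h = self.conv(x.transpose(1, 2))      # -> (batch, 128, seq_len)
        h, _ = self.lstm(h.transpose(1, 2))   # -> (batch, seq_len, hidden)
        h, _ = self.attn(h, h, h)             # attend over time steps
        return self.proj(h.mean(dim=1))       # pooled, projected representation

def nt_xent(z1, z2, temperature=0.1):
    # SimCLR NT-Xent loss over two augmented views of the same mini-batch.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)
    sim = z @ z.T / temperature
    sim.fill_diagonal_(float('-inf'))         # exclude self-similarity
    n = z1.size(0)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

During pre-training, two augmented views of each unlabeled sensor window would be encoded and the NT-Xent loss minimized (with SAM wrapping the optimizer, as the abstract describes); fine-tuning would replace the projection head with a small classifier trained on the labeled subset.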

Funder

AGE-WELL

Publisher

MDPI AG
