Abstract
Manually coding behavior from sensory observation data to enable further analyses is an expensive process. To mitigate its inherent subjectivity, multiple domain experts are typically involved, which further increases the labeling effort. In this work, we investigate whether social behavior and environments can be coded automatically from uncontrolled everyday audio recordings using deep learning. Recordings of daily living were obtained from healthy young and older adults at randomly selected times during the day using a wearable device, yielding a dataset of uncontrolled everyday audio recordings. For classification, we implemented a transfer learning approach based on a publicly available pretrained neural network with subsequent fine-tuning. The results suggest that certain aspects of social behavior and environments can be classified automatically. The ambient noise in uncontrolled audio recordings, however, poses a major challenge for automatic behavior assessment, particularly when coupled with data sparsity.
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry
Cited by 4 articles.