Large Language Models for Wearable Sensor-Based Human Activity Recognition, Health Monitoring, and Behavioral Modeling: A Survey of Early Trends, Datasets, and Challenges
Affiliations:
1. Thomas Lord Department of Computer Science, University of Southern California, Los Angeles, CA 90007, USA
2. Information Sciences Institute, School of Advanced Computing, University of Southern California, Los Angeles, CA 90007, USA
Abstract
The proliferation of wearable technology enables the generation of vast amounts of sensor data, offering significant opportunities for advancements in health monitoring, activity recognition, and personalized medicine. However, the complexity and volume of these data present substantial challenges in data modeling and analysis, which have been addressed with approaches ranging from classical time-series modeling to deep learning techniques. The latest frontier in this domain is the adoption of large language models (LLMs), such as GPT-4 and Llama, for data analysis, modeling, understanding, and human behavior monitoring through the lens of wearable sensor data. This survey explores the current trends and challenges in applying LLMs to sensor-based human activity recognition and behavior modeling. We discuss the nature of wearable sensor data, the capabilities and limitations of LLMs in modeling such data, and their integration with traditional machine learning techniques. We also identify key challenges, including data quality, computational requirements, interpretability, and privacy concerns. By examining case studies and successful applications, we highlight the potential of LLMs to enhance the analysis and interpretation of wearable sensor data. Finally, we propose future directions for research, emphasizing the need for improved preprocessing techniques, more efficient and scalable models, and interdisciplinary collaboration. This survey aims to provide a comprehensive overview of the intersection between wearable sensor data and LLMs, offering insights into the current state and future prospects of this emerging field.
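A common pattern in the work this survey covers is serializing raw wearable sensor readings into text so an off-the-shelf LLM can reason over them. The sketch below illustrates one such serialization for a tri-axial accelerometer window; the function name, prompt template, and label set are illustrative assumptions, not a method from the survey, and the resulting prompt would be sent to an LLM API of the reader's choice.

```python
# Illustrative sketch (not from the survey): rendering a tri-axial
# accelerometer window as a zero-shot activity-recognition prompt.

def window_to_prompt(samples, sampling_hz, labels):
    """Serialize (ax, ay, az) samples into a textual HAR prompt.

    samples: list of (ax, ay, az) tuples in units of g.
    sampling_hz: sampling rate used to timestamp each row.
    labels: candidate activity labels offered to the model.
    """
    rows = "\n".join(
        f"t={i / sampling_hz:.2f}s ax={ax:+.2f} ay={ay:+.2f} az={az:+.2f}"
        for i, (ax, ay, az) in enumerate(samples)
    )
    return (
        "You are an activity-recognition assistant.\n"
        f"Choose exactly one label from: {', '.join(labels)}.\n"
        "Tri-axial accelerometer readings (in g):\n"
        f"{rows}\n"
        "Answer with the single best label."
    )

if __name__ == "__main__":
    window = [(0.01, -0.02, 0.98), (0.85, -0.40, 1.30), (0.90, -0.35, 1.25)]
    prompt = window_to_prompt(
        window, sampling_hz=50, labels=["walking", "sitting", "running"]
    )
    print(prompt)
```

In practice, surveyed systems differ mainly in this interface step: some feed numeric readings as text (as above), while others first extract features or learn sensor embeddings that are aligned with the LLM's input space.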