Towards Efficient Emotion Self-report Collection Using Human-AI Collaboration

Authors:

Prajwal M.1, Ayush Raj2, Sougata Sen2, Snehanshu Saha3, Surjya Ghosh2

Affiliations:

1. Department of Electrical & Electronics Engineering, BITS Pilani Goa, India

2. Department of Computer Science & Information Systems, BITS Pilani Goa, India

3. APPCAIR, Department of Computer Science & Information Systems, BITS Pilani Goa, and HappyMonk AI Labs, India

Abstract

Emotion-aware services are increasingly used in applications such as gaming, mental health tracking, video conferencing, and online tutoring. The core of such services is usually a machine learning model that automatically infers a user's emotions from biological indicators (e.g., physiological signals and facial expressions). However, such models often require a large number of emotion annotations or ground-truth labels, which are typically collected as manual self-reports in long-term user studies based on the Experience Sampling Method (ESM). Responding to repetitive ESM probes for self-reports is time-consuming and fatigue-inducing. This burden leads users to respond arbitrarily or drop out of the studies, compromising model performance. To counter this issue, we propose HACE, a Human-AI Collaborative Emotion self-report collection framework that significantly reduces self-report collection effort. HACE encompasses an active learner that is bootstrapped with a few emotion self-reports (as seed samples) and then queries the user only for instances it is not confident about, retraining itself on those responses to predict emotion self-reports more efficiently. We evaluated the framework in a smartphone keyboard-based emotion self-report collection scenario through a 3-week in-the-wild study (N = 32). The evaluation of HACE on this dataset (≈11,000 typing sessions corresponding to more than 200 hours of typing data) demonstrates that it requires 46% fewer self-reports than the baselines to train the emotion self-report detection model, yet outperforms the baselines with an average self-report detection F-score of 85%. These findings demonstrate the feasibility of adopting such a human-AI collaborative approach to reduce emotion self-report collection effort.
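
To make the querying loop described in the abstract concrete, below is a minimal, illustrative sketch of pool-based active learning with a confidence threshold; it is not the authors' implementation. The classifier choice (logistic regression), the threshold value CONF_THRESHOLD, and the synthetic feature and label arrays are assumptions introduced purely for illustration.

```python
# Minimal sketch (not the authors' code): confidence-thresholded active learning,
# mirroring the abstract's idea of probing the user only for not-so-confident instances.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for typing-session features and binary emotion self-reports.
X_seed, y_seed = rng.normal(size=(50, 8)), rng.integers(0, 2, 50)
X_pool, y_pool = rng.normal(size=(500, 8)), rng.integers(0, 2, 500)

CONF_THRESHOLD = 0.75  # assumed cut-off between "confident" and "not-so-confident"
model = LogisticRegression(max_iter=1000).fit(X_seed, y_seed)

X_train, y_train = list(X_seed), list(y_seed)
queried = 0
for x, y_true in zip(X_pool, y_pool):
    conf = model.predict_proba(x.reshape(1, -1)).max()
    if conf < CONF_THRESHOLD:
        # Not confident: probe the user for a self-report and retrain on it.
        X_train.append(x)
        y_train.append(y_true)  # in deployment, this label comes from an ESM probe
        queried += 1
        model = LogisticRegression(max_iter=1000).fit(np.array(X_train), np.array(y_train))
    # Confident: accept the model's prediction; no probe is issued.

print(f"queried {queried}/{len(X_pool)} sessions ({100 * queried / len(X_pool):.0f}%)")
```

In the paper's setting, the pool items would correspond to smartphone keyboard typing sessions and the queried labels to ESM self-report probes; only the low-confidence sessions trigger a probe, which is how the collection effort is reduced.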

Publisher

Association for Computing Machinery (ACM)

Subject

Computer Networks and Communications, Hardware and Architecture, Human-Computer Interaction


Cited by 2 articles.

1. HAIGEN: Towards Human-AI Collaboration for Facilitating Creativity and Style Generation in Fashion Design. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2024-08-22.

2. Towards Estimating Missing Emotion Self-reports Leveraging User Similarity: A Multi-task Learning Approach. Proceedings of the CHI Conference on Human Factors in Computing Systems, 2024-05-11.
