Affiliation:
1. University of British Columbia, Vancouver, Canada
Abstract
Previous work has shown that some user cognitive abilities relevant to processing information visualizations can be predicted from eye-tracking data. Performing this type of user modeling is important for devising visualizations that can detect a user's abilities and adapt accordingly during the interaction. In this article, we extend previous user modeling work by investigating, for the first time, interaction data as an alternative source for predicting cognitive abilities during visualization processing when collecting eye-tracking data is not feasible. We present an extensive comparison of user models based solely on eye-tracking data, solely on interaction data, and on a combination of the two. Although eye-tracking data generate the most accurate predictions, results show that interaction data can still outperform a majority-class baseline, meaning that interaction data alone could enable adaptation for interactive visualizations even when eye tracking is not feasible. Furthermore, we found that interaction data can predict several cognitive abilities with better accuracy than eye-tracking data at the very beginning of the task, which is valuable for delivering adaptation early in the task. We also extend previous work by examining the value of multimodal classifiers that combine interaction data and eye-tracking data, with promising results for some of our target cognitive abilities. Next, we contribute to previous work by extending the types of visualizations considered and the set of cognitive abilities that can be predicted from either eye-tracking or interaction data. Finally, we evaluate how noise in gaze data impacts prediction accuracy and find that retaining even rather noisy gaze data points can yield predictions equal to or better than those obtained by discarding them, a novel and important contribution for devising adaptive visualizations in real settings, where eye-tracking data are typically noisier than in laboratory settings.
Funder
MITACS
Envision Sustainability Tools Inc
Publisher
Association for Computing Machinery (ACM)
Subject
Artificial Intelligence, Human-Computer Interaction
Cited by
25 articles.