Affiliation:
1. University of Virginia, Charlottesville, VA, USA
Abstract
Wearable devices allow quick and convenient interactions for controlling mobile computers. However, these interactions are often device-dependent: users cannot control devices in the way they are accustomed to unless they are wearing that particular wearable device. This paper proposes a new method, UnifiedSense, which enables device-dependent gestures even when the device that normally detects them is unavailable, by utilizing sensors on the user's other wearable devices. UnifiedSense achieves this without explicit gesture training for each device; instead, it trains its recognition model while users naturally perform gestures. The recognizer uses gestures detected on the primary device (i.e., a device that reliably detects gestures) as labels for training samples and collects sensor data from all other wearable devices on the user. We conducted a technical evaluation with data collected from 15 participants using four types of wearable devices. It showed that UnifiedSense could correctly recognize 5 gestures (5 gestures × 5 configurations) with an accuracy of 90.9% (SD = 1.9%) without the primary device present.
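The abstract's core idea is opportunistic labeling: gestures recognized by the primary device serve as ground-truth labels for sensor windows captured at the same moment on the user's other wearables, so a cross-device recognizer is trained without any explicit training session. A minimal sketch of that training-and-recognition loop is below; the function names and the nearest-centroid classifier are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of UnifiedSense-style opportunistic labeling:
# labels come from the primary device, features from secondary devices.
# The nearest-centroid classifier stands in for the paper's real model.
from collections import defaultdict

def train(samples):
    """samples: list of (label_from_primary, feature_vector) pairs,
    where features were captured on secondary devices while the
    primary device detected the labeled gesture. Returns per-gesture
    centroids for a minimal nearest-centroid recognizer."""
    sums, counts = {}, defaultdict(int)
    for label, feats in samples:
        if label not in sums:
            sums[label] = [0.0] * len(feats)
        sums[label] = [s + f for s, f in zip(sums[label], feats)]
        counts[label] += 1
    return {lab: [s / counts[lab] for s in sums[lab]] for lab in sums}

def recognize(model, feats):
    """Classify a secondary-device window when the primary device is absent."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, feats))
    return min(model, key=lambda lab: sq_dist(model[lab]))
```

In deployment, `train` would run continuously in the background whenever the primary device fires a detection, which is how the system avoids a dedicated calibration phase.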
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Networks and Communications, Human-Computer Interaction, Social Sciences (miscellaneous)