Affiliation:
1. Carnegie Mellon University, Pittsburgh, USA
Abstract
Pointing with one's finger is a natural and rapid way to denote an area or object of interest. It is routinely used in human-human interaction to increase both the speed and accuracy of communication, but it is rarely utilized in human-computer interactions. In this work, we use the recent inclusion of wide-angle, rear-facing smartphone cameras, along with hardware-accelerated machine learning, to enable real-time, infrastructure-free, finger-pointing interactions on today's mobile phones. We envision users raising their hands to point in front of their phones as a "wake gesture". This can then be coupled with a voice command to trigger advanced functionality. For example, while composing an email, a user can point at a document on a table and say "attach". Our interaction technique requires no navigation away from the current app and is both faster and more privacy-preserving than the current method of taking a photo.
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Networks and Communications; Human-Computer Interaction; Social Sciences (miscellaneous)
References (65 articles):
1. TouchPose: Hand Pose Prediction, Depth Estimation, and Touch Classification from Capacitive Images
2. Accuracy of interpreting pointing gestures in egocentric view
3. Apple. 2016. builtInWideAngleCamera API. https://developer.apple.com/documentation/avfoundation/avcapturedevice/devicetype/2361449-builtinwideanglecamera
4. Apple. 2022. ARImageAnchor. https://developer.apple.com/documentation/arkit/arimageanchor
5. Apple. 2022. Metal Framework. https://developer.apple.com/documentation/metal