Abstract
Augmented Reality (AR) devices offer a rich, immersive experience that blends the user's physical surroundings with a synthetic, digitally augmented world. This augmentation is made possible by the AR device's sensors, which continuously collect information both about the user's physical environment (using sensors such as depth and photographic cameras) and about the users themselves (using eye-gaze sensors, for example). This information presents both problems and unique solutions. For example, user eye gaze has been shown to be effective for creating efficient and usable local pairing methods between two or more devices, and for providing a salience signal that determines which visual data should be excluded from raw sensor outputs.
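To make the salience idea concrete, the following is a minimal sketch of gaze-based filtering of a camera frame, assuming a per-frame 2D gaze point in image coordinates; the function name mask_non_salient and the fixed circular radius are illustrative assumptions, not the method described in the paper.

```python
import numpy as np

def mask_non_salient(frame: np.ndarray, gaze_xy: tuple, radius: int = 120) -> np.ndarray:
    """Return a copy of `frame` with pixels outside a circular, gaze-centred
    region zeroed out, so only gaze-salient content leaves the device.
    Hypothetical helper for illustration only."""
    h, w = frame.shape[:2]
    ys, xs = np.ogrid[:h, :w]          # pixel coordinate grids
    gx, gy = gaze_xy
    keep = (xs - gx) ** 2 + (ys - gy) ** 2 <= radius ** 2
    masked = np.zeros_like(frame)
    masked[keep] = frame[keep]         # copy only the gaze-salient pixels
    return masked

# Usage: a synthetic 480x640 RGB frame and a gaze point near the centre.
frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
protected = mask_non_salient(frame, gaze_xy=(320, 240))
```

In practice, the salient region would track the gaze signal over time rather than use a fixed radius; this sketch only illustrates the general filtering step.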
Publisher
Association for Computing Machinery (ACM)