Abstract
Residual visual capabilities and the associated phenomenological experience can differ significantly between persons with similar visual acuity and the same diagnosis. There is substantial variance in the situations and tasks that persons with low vision find challenging. Smartglasses offer the opportunity to present individualized visual feedback targeted to each user’s requirements. Here, we interviewed nine persons with low vision to gain insight into their subjective perceptual experience in relation to factors such as illumination, color, contrast, and movement, as well as contextual factors. Further, we contribute a collection of everyday activities that rely on visual perception, along with strategies participants employ in their everyday lives. We find that our participants rely on their residual vision as the dominant sense in many everyday activities. They prefer vision over other modalities whenever they can perceive the information visually, which highlights the need for assistive devices with visual feedback.
Publisher
Springer International Publishing
Cited by
5 articles.
1. Multi-Modal Interactions of Mixed Reality Framework. 2024 IEEE 17th Dallas Circuits and Systems Conference (DCAS), 2024-04-19.
2. Customizable Multi-Modal Mixed Reality Framework. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2024-03-16.
3. MR-Sense: A Mixed Reality Environment Search Assistant for Blind and Visually Impaired People. 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), 2024-01-17.
4. BrailleBuddy: A Tangible User Interface to Support Children with Visual Impairment in Learning Braille. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 2023-04-19.
5. A Dataset and Machine Learning Approach to Classify and Augment Interface Elements of Household Appliances to Support People with Visual Impairment. Proceedings of the 28th International Conference on Intelligent User Interfaces, 2023-03-27.