Affiliations:
1. Ural Federal University
2. N.N. Krasovskii Institute of Mathematics and Mechanics (IMM UB RAS)
Abstract
Methods of interaction with virtual objects are an important issue in the development of VR: the user experience depends directly on how successfully users interact with virtual objects and the environment. The factors affecting the effectiveness of human-computer interaction have therefore become a focal point of research in this field. The paper describes an experiment on the influence of two visibility conditions during indirect interaction: the visibility of the interaction tool and the highlighting of the virtual object being manipulated. We present the conditions, methodology, and results of this study. The influence of different levels of object abstraction is considered: from geometric bodies (balls and parallelepipeds) to cartoon fruits and then to photorealistic objects. Under these conditions, a classical two-factor experiment is carried out, in which the first factor is the visibility of the control beam and the second is the visibility of the highlighting of the captured object. One of these factors, or their combination, is expected to affect the success of moving the captured object. Success is determined by the movement time (the less time, the more successful) and the optimality of the trajectory. A formula is proposed that quantifies the optimality of the trajectory. Surprisingly, the movement of an already captured object turned out to be unaffected by either factor. Apparently, the capture process itself must be studied carefully, excluding the post-capture movement from the analysis.
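The paper's actual optimality formula is not reproduced in this abstract. Purely as an illustration, a common way to score trajectory optimality is the ratio of the straight-line distance between the start and end positions to the length of the path actually traveled; the Python sketch below implements that idea with a hypothetical path_optimality helper and should not be read as the authors' formula.

import math

def path_optimality(points):
    """Illustrative optimality score: straight-line distance divided by
    traveled path length. Returns 1.0 for a perfectly straight movement
    and values closer to 0 for increasingly roundabout trajectories.
    NOTE: this metric is an assumption, not the formula from the paper.
    `points` is a sequence of (x, y, z) samples of the object's position."""
    if len(points) < 2:
        raise ValueError("need at least two position samples")

    def dist(a, b):
        # Euclidean distance between two 3D points
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    straight = dist(points[0], points[-1])
    traveled = sum(dist(p, q) for p, q in zip(points, points[1:]))
    return straight / traveled if traveled > 0 else 1.0

# Example: a slightly curved capture-and-move trajectory
trajectory = [(0.0, 0.0, 0.0), (0.4, 0.1, 0.0), (0.8, 0.05, 0.0), (1.0, 0.0, 0.0)]
print(path_optimality(trajectory))  # close to 1.0 => near-optimal movement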
Publisher
Keldysh Institute of Applied Mathematics