Author:
Chen Qijia, Cai Jie, Jacucci Giulio
Abstract
Social Virtual Reality (VR) is growing in popularity and has drawn the attention of HCI researchers. Like other online environments, social VR suffers from harassment. The Trust System (TS) in VRChat, one of the most prominent social VR platforms, is designed to measure and indicate users’ trustworthiness in order to reduce toxicity on the platform. In this research, we analyzed data from “r/VRChat” to understand how users perceive the system. We found that users interpret the system differently and that problems in its implementation cause distrust. The trust ranks, while intended to promote positive interactions, can lead to stereotyping and discourage communication between users of different ranks. The hierarchical structure of the ranks exacerbates discrimination and conflict, particularly against low-ranked users. We further discuss how trust ranks present challenges to newcomers and contribute to a competitive atmosphere that hinders the formation of less toxic norms. Finally, we provide implications for the future design of similar systems.
Funder
Research Council of Finland
University of Helsinki
Publisher
Springer Science and Business Media LLC