Abstract
Mimic vlogs – fictional web series that tell stories using a vlog format – draw on audience expectations to elicit a particular response. Much like mockumentaries and other forms of parody, mimic vlogs use the conventions of an authentic format to tell a story in a way that resembles a genre audiences know and trust. It is therefore integral to understand how viewers approach and interpret these videos, particularly on a platform such as YouTube, which hosts both amateur, user-generated content and professionally produced content. Mimic vlogs constitute a small part of a much larger phenomenon of replica content online, which also includes deepfakes, cheap fakes, fake news, misinformation, and disinformation. This exploratory paper draws on primary data from YouTube viewers to investigate the methods audience members use to identify video content. Participants watched and responded to a series of eight videos comprising both user-generated vlogs and fictional mimic vlogs, allowing the study to determine which elements viewers considered while categorising the videos. The approaches participants employed were frequently unreliable, with participants reaching different conclusions from the same piece of information. Contributing factors included viewers’ perceptions of authenticity, plausibility, and markers of quality in each video. The results illustrate the different ways in which audiences read texts, in line with Stuart Hall’s encoding and decoding theory (1980) and broader audience reception studies, which suggest that audiences play a vital role in interpreting texts and their meaning. Consequently, this research shows how audiences are vulnerable to even low-stakes replica content online, in part because of how they decode textual elements.
Cited by
1 article.
1. Ranting in emotional public spheres: Publicizing participatory challenges on YouTube. Convergence: The International Journal of Research into New Media Technologies, 2024-04.