Transparency in Measurement Reporting

Author:

Aeschbach Lena Fanya¹, Perrig Sebastian A.C.¹, Weder Lorena¹, Opwis Klaus¹, Brühlmann Florian¹

Affiliation:

¹ Universität Basel, Basel, Switzerland

Abstract

Measuring theoretical concepts, so-called constructs, is a central challenge of Player Experience research. Building on recent work in HCI and psychology, we conducted a systematic literature review to study the transparency of measurement reporting. We accessed the ACM Digital Library to analyze all 48 full papers published at CHI PLAY 2020; of those, 24 papers used self-report measurements and were included in the full review. Specifically, we assessed whether researchers reported What, How, and Why they measured. We found that researchers matched their measures to the construct under study and that administrative details, such as the number of points on a Likert-type scale, were frequently reported. However, definitions of the constructs to be measured and justifications for selecting a particular scale were sparse. A lack of transparency in these areas not only threatens the validity of individual studies but also compromises theory building and the accumulation of research knowledge in meta-analytic work. This work is limited to assessing the current transparency of measurement reporting at CHI PLAY 2020; however, we argue that this constitutes a fair foundation for identifying potential pitfalls. To address these pitfalls, we propose a prescriptive model of the measurement selection process that helps researchers systematically define their constructs, specify operationalizations, and justify why these measures were chosen. Future research employing this model should contribute to greater transparency in measurement reporting. The research was funded through internal resources. All materials are available at https://osf.io/4xz2v/.

Publisher

Association for Computing Machinery (ACM)

Subject

Computer Networks and Communications, Human-Computer Interaction, Social Sciences (miscellaneous)

References: 57 articles.

Cited by 13 articles.

1. What I Don’t Like about You?: A Systematic Review of Impeding Aspects for the Usage of Conversational Agents;Interacting with Computers;2024-05-31

2. "Never The Same": Systematic Analysis of the Methodological Issues in the Presence Studies That Employ Questionnaires;Extended Abstracts of the CHI Conference on Human Factors in Computing Systems;2024-05-11

3. Experimenting with new review methods, open practices, and interactive publications in HCI;Extended Abstracts of the CHI Conference on Human Factors in Computing Systems;2024-05-11

4. Self-Efficacy and Security Behavior: Results from a Systematic Review of Research Methods;Proceedings of the CHI Conference on Human Factors in Computing Systems;2024-05-11

5. Independent Validation of the Player Experience Inventory: Findings from a Large Set of Video Game Players;Proceedings of the CHI Conference on Human Factors in Computing Systems;2024-05-11
