Response Burden and Dropout in a Probability-Based Online Panel Study – A Comparison between an App and Browser-Based Design

Authors:

Roberts Caroline (1), Herzing Jessica M.E. (2), Asensio Manjon Marc (1), Abbet Philip (3), Gatica-Perez Daniel (3,4)

Affiliations:

1. Institute of Social Sciences, Faculty of Social and Political Sciences, University of Lausanne, Bâtiment Géopolis, Quartier Mouline, 1015 Lausanne, Switzerland.

2. Interfaculty Centre for Educational Research (ICER), University of Bern, Fabrikstrasse 8, 3012 Bern, Switzerland.

3. Idiap Research Institute, Rue Marconi 19, 1920 Martigny, Switzerland. E-mail: philip.abbet@idiap.ch

4. EPFL (École polytechnique fédérale de Lausanne), School of Engineering and College of Humanities, Inn Building, Station 14, 1015 Lausanne, Switzerland.

Abstract

Survey respondents can complete web surveys using different Internet-enabled devices (PCs versus mobile phones and tablets) and different software (a web browser versus a mobile software application, "app"). Previous research has found that completing questionnaires via a browser on mobile devices can lead to higher breakoff rates and reduced measurement quality compared to using PCs, especially where questionnaires have not been adapted for mobile administration. A key explanation is that using a mobile browser is more burdensome and less enjoyable for respondents. There are reasons to assume apps should perform better than browsers, but so far there have been few attempts to assess this empirically. In this study, we investigate variation in experienced burden across devices and software in wave 1 of a three-wave panel study, comparing an app with a browser-based survey, in which sample members were encouraged to use a mobile device. We also assess device/software effects on participation at wave 2. We find that, compared to mobile browser respondents, app respondents were less likely to drop out of the study after the first wave, and that the effect of the device used was mediated by the subjective burden experienced during wave 1.

Publisher

Walter de Gruyter GmbH


Cited by 2 articles.