Your Best Estimate Is Fine. Or Is It?

Author:

Jerry Timbrook 1, Kristen Olson 2, Jolene D. Smyth 2

Affiliation:

1. RTI International, 3040 Cornwallis Road, Research Triangle Park, NC 27709, U.S.A.

2. University of Nebraska-Lincoln, 711 Oldfather Hall, Lincoln, NE 68588-0324, U.S.A.

Abstract

Providing an exact answer to open-ended numeric questions can be a burdensome task for respondents. Researchers often assume that adding an invitation to estimate (e.g., “Your best estimate is fine”) to these questions reduces cognitive burden, and in turn, reduces rates of undesirable response behaviors like item nonresponse, nonsubstantive answers, and answers that must be processed into a final response (e.g., qualified answers like “about 12” and ranges). Yet there is little research investigating this claim. Additionally, explicitly inviting estimation may lead respondents to round their answers, which may affect survey estimates. In this study, we investigate the effect of adding an invitation to estimate to 22 open-ended numeric questions in a mail survey and three questions in a separate telephone survey. Generally, we find that explicitly inviting estimation does not significantly change rates of item nonresponse, rounding, or qualified/range answers in either mode, though it does slightly reduce nonsubstantive answers for mail respondents. In the telephone survey, an invitation to estimate results in fewer conversational turns and shorter response times. Our results indicate that an invitation to estimate may simplify the interaction between interviewers and respondents in telephone surveys, and neither hurts nor helps data quality in mail surveys.

Publisher

Walter de Gruyter GmbH

