Affiliations:
1. RTI International, 3040 Cornwallis Road, Research Triangle Park, NC 27709, U.S.A.
2. University of Nebraska-Lincoln, 711 Oldfather Hall, Lincoln, NE 68588-0324, U.S.A.
Abstract
Providing an exact answer to open-ended numeric questions can be a burdensome task for respondents. Researchers often assume that adding an invitation to estimate (e.g., “Your best estimate is fine”) to these questions reduces cognitive burden, and in turn, reduces rates of undesirable response behaviors like item nonresponse, nonsubstantive answers, and answers that must be processed into a final response (e.g., qualified answers like “about 12” and ranges). Yet there is little research investigating this claim. Additionally, explicitly inviting estimation may lead respondents to round their answers, which may affect survey estimates. In this study, we investigate the effect of adding an invitation to estimate to 22 open-ended numeric questions in a mail survey and three questions in a separate telephone survey. Generally, we find that explicitly inviting estimation does not significantly change rates of item nonresponse, rounding, or qualified/range answers in either mode, though it does slightly reduce nonsubstantive answers for mail respondents. In the telephone survey, an invitation to estimate results in fewer conversational turns and shorter response times. Our results indicate that an invitation to estimate may simplify the interaction between interviewers and respondents in telephone surveys, and neither hurts nor helps data quality in mail surveys.