Abstract
Objective
To identify factors that influence the response processes of patients providing quantitative self-report data and, given the lack of integrative and explanatory models in this area, to develop a model of patients’ response processes that can guide what to look for when considering validity evidence and interpreting scores on individual items.
Methods
Participants (n = 13) were recruited from a specialized substance use disorder treatment clinic and interviewed while responding to items from a clinical feedback system implemented for routine outcome monitoring in that setting. The interview approach was based on cognitive interviewing. Data collection and analysis were inspired by a grounded theory approach.
Results
We identified several variables that influenced the participants’ response processes. The variables were organized into five categories: context-related variables; item-related variables; response base variables; reasoning strategies; and response selection strategies. We also found that the participants’ responses for many items were affected by different aspects of the response process in ways that are relevant to interpretation but not necessarily discernible from the numerical scores alone, and we developed response categories to capture this.
Conclusion
The findings suggest that patients providing quantitative self-report data encounter conditions in the response process that challenge and influence their ability to convey meaning and accuracy. As a result, responses to many of the items reflect messages important for interpretation and follow-up, even when these are not apparent from the numerical scores alone. The proposed model may be a useful tool when developing items, assessing validity, and interpreting responses.
Funder
Faculty of Medicine and Health Sciences, NTNU Norwegian University of Science and Technology
Publisher
Springer Science and Business Media LLC