Affiliation:
1. University of California San Diego, La Jolla, CA, USA
2. University of Nevada Las Vegas, Las Vegas, NV, USA
3. Gomoll Research & Design, Milwaukee, WI, USA
Abstract
All research, whether qualitative or quantitative, is concerned with the extent to which its analyses adequately describe the phenomena it seeks to describe. In qualitative research, we use internal validity checks such as intercoder agreement to measure the extent to which independent researchers observe the same phenomena in data. Researchers report indices of agreement as evidence of the consistency and dependability of interpretations, and we do so to make claims about the trustworthiness of our research accounts. However, few studies report how multiple analysts developed alignment in their interpretations of data, a process that undergirds accounts of consistency, dependability, and trustworthiness. In this article, we review the issues and options around achieving intercoder agreement. Drawing on our experience from a longitudinal, team-based research project that required rapid cycles of qualitative data analysis, we reflect on the challenges we faced in achieving high intercoder agreement (which we refer to as the perils). Through these challenges, we developed a method that helps foster shared ways of seeing data, and thus alignment in our interpretations of phenomena in the data. We present this method as a tool for dyadic and team-based qualitative data analysis, facilitating reliable and consistent high-inference interpretations of data by multiple analysts.
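As a point of reference only (not drawn from the article itself), the sketch below illustrates one common index of intercoder agreement, Cohen's kappa, for two coders applying categorical codes to the same set of excerpts; the coder labels and data are hypothetical.

```python
# Illustrative sketch: Cohen's kappa as an index of intercoder agreement
# between two analysts coding the same items. Not the article's method.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' categorical codes on the same items."""
    assert len(codes_a) == len(codes_b), "Both coders must code the same items"
    n = len(codes_a)
    # Observed agreement: proportion of items both coders labeled identically.
    p_observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement, from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: two coders applying themes to ten excerpts.
coder_1 = ["theme_a", "theme_a", "theme_b", "theme_b", "theme_a",
           "theme_c", "theme_b", "theme_a", "theme_c", "theme_b"]
coder_2 = ["theme_a", "theme_b", "theme_b", "theme_b", "theme_a",
           "theme_c", "theme_a", "theme_a", "theme_c", "theme_b"]
print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")  # ~0.69
```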