Affiliation:
1. University of Minnesota
Abstract
When two observers classify a sample of items using the same categorical scale, and when different disagreements are weighted differently, Cohen's weighted kappa (K_w) may serve as a measure of interobserver agreement. We propose a kappa-based weighted measure (K_ws) of agreement on a specific category s, with K_w being a weighted average of all the K_ws values. Thus, while Cohen's K_w is a summary measure of overall agreement, the proposed K_ws measures the extent to which the observers agree on each specific category; because of the weights used, both measures are suitable for ordinal categories. Statistical inference for K_ws and for its unweighted counterpart is also discussed. A numerical example is provided.
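For reference on the notation above, the following is a minimal Python sketch of the standard Cohen weighted kappa K_w computed from a two-observer contingency table, using disagreement weights (quadratic by default). The function name, the default weighting scheme, and the example table are illustrative assumptions; the paper's proposed category-specific K_ws is not reproduced here.

```python
import numpy as np

def cohen_weighted_kappa(counts, weights=None):
    """Cohen's weighted kappa K_w for a k x k contingency table.

    counts[i, j] = number of items placed in category i by observer 1
    and category j by observer 2. `weights` are disagreement weights
    (w_ii = 0); quadratic weights are used if none are supplied.
    """
    counts = np.asarray(counts, dtype=float)
    k = counts.shape[0]
    if weights is None:
        idx = np.arange(k)
        weights = (idx[:, None] - idx[None, :]) ** 2   # quadratic disagreement weights
    p = counts / counts.sum()                           # joint proportions p_ij
    p_row = p.sum(axis=1)                               # observer 1 marginals p_i.
    p_col = p.sum(axis=0)                               # observer 2 marginals p_.j
    observed = (weights * p).sum()                      # weighted observed disagreement
    expected = (weights * np.outer(p_row, p_col)).sum() # chance-expected weighted disagreement
    return 1.0 - observed / expected

# Hypothetical 3-category example: rows = observer 1, columns = observer 2.
table = np.array([[20,  5,  1],
                  [ 4, 15,  6],
                  [ 1,  7, 16]])
print(round(cohen_weighted_kappa(table), 3))
```

With quadratic weights this formulation is equivalent to the agreement-weight version of K_w; only the form of the weights differs.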