Affiliation:
1. University of Minnesota
Abstract
Cohen's Kappa is a measure of the overall agreement between two raters classifying items into a given set of categories. This communication describes a simple computational method for determining the agreement on specific categories without the need to collapse the original data table, as the previous Kappa-based method requires. It is also pointed out that Kappa may be formulated in terms of certain distance metrics. The computational procedure for the specific agreement measure is illustrated with hypothetical data from psychological diagnoses.
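For context, a minimal sketch of the standard overall Kappa computation on a two-rater contingency table is given below. The function name `cohens_kappa` and the illustrative counts are assumptions for this sketch; it shows the overall statistic the abstract refers to, not the paper's specific-category procedure.

```python
import numpy as np

def cohens_kappa(table):
    """Overall Cohen's Kappa for a two-rater contingency table.

    table[i][j] counts items that rater 1 placed in category i
    and rater 2 placed in category j.
    """
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n          # proportion of exact agreements
    row_marg = table.sum(axis=1) / n          # rater 1 category proportions
    col_marg = table.sum(axis=0) / n          # rater 2 category proportions
    p_expected = np.dot(row_marg, col_marg)   # agreement expected by chance
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical 3-category diagnostic table (counts are illustrative only).
table = [[30,  5,  5],
         [ 4, 25,  6],
         [ 6,  4, 15]]
print(round(cohens_kappa(table), 3))
```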
Cited by
94 articles.