New interpretations of Cohen's kappa

MJ Warrens - Journal of Mathematics, 2014 - Wiley Online Library
… 2 × 2 table we may calculate the kappa value. The value of a category kappa is a measure of
the agreement … The overall kappa is a weighted average of the m category kappas [15–17]. …
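
A minimal sketch of that weighted-average reading, with invented counts; the weights used here (the denominators of the category kappas) are the ones under which the identity is usually stated, not necessarily the paper's exact notation:

    import numpy as np

    # Invented 3 x 3 agreement table (rows: rater A, columns: rater B).
    counts = np.array([[30, 5, 2],
                       [4, 25, 6],
                       [1, 7, 20]], dtype=float)
    p = counts / counts.sum()                    # joint proportions
    row, col = p.sum(axis=1), p.sum(axis=0)      # marginal proportions

    p_o = np.trace(p)                            # observed agreement
    p_e = (row * col).sum()                      # chance agreement
    kappa = (p_o - p_e) / (1 - p_e)              # overall Cohen's kappa

    # Category kappa: kappa of the 2 x 2 table "category i" vs. "all other categories".
    def category_kappa(i):
        o_i = 1 - row[i] - col[i] + 2 * p[i, i]            # observed agreement, collapsed
        e_i = 1 - row[i] - col[i] + 2 * row[i] * col[i]    # chance agreement, collapsed
        return (o_i - e_i) / (1 - e_i)

    cat_kappas = np.array([category_kappa(i) for i in range(len(p))])
    weights = row + col - 2 * row * col          # = 1 - e_i, the category-kappa denominators
    # The weighted average of the category kappas reproduces the overall kappa.
    print(kappa, (weights * cat_kappas).sum() / weights.sum())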

Interval estimation for Cohen's kappa as a measure of agreement

NJM Blackman, JJ Koval - Statistics in Medicine, 2000 - Wiley Online Library
… detail in Table VI. We have shown that under the common correlation model, the normal
approximation to the distribution of Cohen's kappa is appropriate as long as true agreement is …
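
For context only, a generic way to attach an interval estimate in code; this is a nonparametric bootstrap sketch with invented data, not the common-correlation-model normal approximation studied by Blackman and Koval:

    import numpy as np

    def cohens_kappa(r1, r2, k):
        # r1, r2: integer label vectors in {0, ..., k-1} for the two raters.
        table = np.zeros((k, k))
        for a, b in zip(r1, r2):
            table[a, b] += 1
        p = table / table.sum()
        p_o = np.trace(p)
        p_e = (p.sum(axis=1) * p.sum(axis=0)).sum()
        return (p_o - p_e) / (1 - p_e)

    rng = np.random.default_rng(0)
    # Invented paired ratings: rater 2 copies rater 1 about 70% of the time.
    r1 = rng.integers(0, 3, size=60)
    r2 = np.where(rng.random(60) < 0.7, r1, rng.integers(0, 3, size=60))

    boot = []
    for _ in range(2000):                        # percentile bootstrap over subjects
        s = rng.integers(0, len(r1), size=len(r1))
        boot.append(cohens_kappa(r1[s], r2[s], 3))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(cohens_kappa(r1, r2, 3), (lo, hi))     # point estimate and 95% interval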

Interrater Agreement Measures: Comments on Kappa_n, Cohen's Kappa, Scott's π, and Aickin's α

LM Hsu, R Field - Understanding Statistics, 2003 - Taylor & Francis
agreement about base rates. Contrary to the views of recent critics of Cohen’s kappa, we argue
that Cohen’s kappa … the definition of the chance agreement rate for the (k × k) table is Σ{[Pi…
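
For reference, the chance-agreement rate that the truncated expression appears to refer to is, in the usual notation for a k × k table,

    P_e = \sum_{i=1}^{k} p_{i+} \, p_{+i},
    \qquad
    \kappa = \frac{P_o - P_e}{1 - P_e},

where p_{i+} and p_{+i} are the row and column marginal proportions of category i and P_o is the observed agreement.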

[PDF][PDF] The modified Cohen's kappa: Calculating interrater agreement for segmentation and annotation

H Holle, R Rein - Understanding body movements: A guide to …, 2013 - researchgate.net
… , which would result in a 7 × 7 contingency table for the agreement analysis. Because it is
impractical to calculate kappa for such large contingency tables (Kraemer, 1992), we perform a …

Meta-analysis of Cohen's kappa

S Sun - Health Services and Outcomes Research Methodology, 2011 - Springer
… of Cohen’s κ is very simple and can be done from a contingency table using a basic calculator.
The upper limit of κ is +1.00, occurring when and only when the two raters agree perfectly…
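
As a quick illustration of that remark (the counts are invented), the hand calculation for a 2 × 2 table with cells 40, 10, 5, 45 and n = 100 is

    P_o = (40 + 45)/100 = 0.85,
    \qquad
    P_e = (0.50)(0.45) + (0.50)(0.55) = 0.50,
    \qquad
    \kappa = \frac{0.85 - 0.50}{1 - 0.50} = 0.70,

and κ reaches its upper limit of +1.00 exactly when every off-diagonal cell is zero, i.e. P_o = 1.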

Measures of clinical agreement for nominal and categorical data: the kappa coefficient

L Cyr, K Francis - Computers in biology and medicine, 1992 - Elsevier
… The ratings assigned by the two raters are shown in Table 1. The resulting matrix
of agreement and non-agreement derived from Table 1 is illustrated in Table 2. …
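
A minimal sketch of that step, assuming the ratings are stored as two equal-length lists (the data below are invented, not the paper's Table 1):

    from collections import Counter

    rater1 = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
    rater2 = ["yes", "no", "no",  "yes", "no", "yes", "yes", "no"]

    labels = sorted(set(rater1) | set(rater2))
    pairs = Counter(zip(rater1, rater2))
    # matrix[i][j] = number of subjects rated labels[i] by rater 1 and labels[j] by rater 2
    matrix = [[pairs[(a, b)] for b in labels] for a in labels]
    print(labels)
    for r in matrix:
        print(r)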

Interrater reliability: the kappa statistic

ML McHugh - Biochemia medica, 2012 - hrcak.srce.hr
… less than perfect (1.0) is a measure not only of agreement, but also of the reverse, disagree…
Table 3 can be simplified as follows: any kappa below 0.60 indicates inadequate agreement
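
Read as a rule of thumb, that simplified cut-off can be written as follows (the 0.60 threshold is the one in the snippet; the wording of the verdict is mine):

    def adequate_agreement(kappa, threshold=0.60):
        # Simplified reading of McHugh's advice: any kappa below 0.60
        # indicates inadequate agreement between the raters.
        return kappa >= threshold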

Cohen's kappa is a weighted average

MJ Warrens - Statistical Methodology, 2011 - Elsevier
… of an agreement table are nominal and the order in which the categories of a table are listed
is … As an example we consider the 4×4 agreement table presented in Table 1. There are five …
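
In the usual notation, the identity behind the title can be sketched as follows, with the weights taken to be the denominators of the category kappas (a paraphrase, not necessarily the paper's exact statement):

    \kappa = \frac{\sum_{i=1}^{m} w_i \, \kappa_i}{\sum_{i=1}^{m} w_i},
    \qquad
    \kappa_i = \frac{2\,(p_{ii} - p_{i+} p_{+i})}{p_{i+} + p_{+i} - 2\, p_{i+} p_{+i}},
    \qquad
    w_i = p_{i+} + p_{+i} - 2\, p_{i+} p_{+i},

where κ_i is the kappa of the 2 × 2 table obtained by collapsing category i against all remaining categories.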

[PDF][PDF] A comparison of Cohen's kappa and agreement coefficients by Corrado Gini

MJ Warrens - … Research and Reviews in Applied …, 2013 - scholarlypublications …
… In this paper we compare Cohen's kappa to three other agreement coefficients that have
been proposed in the literature. It … For each of the 34 data entries of Table 2 we observe the ordering …

Can one use Cohen's kappa to examine disagreement?

A von Eye, M von Eye - Methodology, 2005 - econtent.hogrefe.com
… use of Cohen’s κ (kappa), Brennan and Prediger’s κ_n, and the coefficient of raw agreement
for … triangle of an agreement table, analogous to the ra coefficients in the previous sections. …
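
A small sketch contrasting the three coefficients named in the snippet, assuming the usual definitions: raw agreement is the proportion of exact matches, and Brennan and Prediger's κ_n replaces Cohen's chance term with 1/k for k categories (invented counts):

    import numpy as np

    counts = np.array([[12, 3, 1],
                       [2, 9, 4],
                       [1, 2, 6]], dtype=float)    # rows: rater A, columns: rater B
    p = counts / counts.sum()
    row, col = p.sum(axis=1), p.sum(axis=0)
    k = len(p)

    p_o = np.trace(p)                              # raw agreement
    p_e = (row * col).sum()                        # Cohen's chance agreement
    kappa = (p_o - p_e) / (1 - p_e)                # Cohen's kappa
    kappa_n = (p_o - 1.0 / k) / (1 - 1.0 / k)      # Brennan and Prediger's kappa_n
    print(p_o, kappa, kappa_n)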