On the marginal dependency of Cohen's κ
A von Eye, M von Eye - European Psychologist, 2008 - econtent.hogrefe.com
Cohen's κ (kappa) is typically used as a measure of degree of rater agreement. It is often
criticized because it is marginal-dependent. In this article, this characteristic is explained and …
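The marginal dependence discussed above can be seen directly from kappa's definition, κ = (p_o − p_e)/(1 − p_e), since the chance term p_e is computed from the raters' marginal distributions. A minimal sketch (the tables are invented for illustration, not taken from the article):

```python
# Cohen's kappa from a square contingency table (rows = rater A, cols = rater B),
# illustrating marginal dependence: two tables with identical observed
# agreement but different marginals yield different kappa values.

def cohens_kappa(table):
    """Unweighted Cohen's kappa for a square contingency table of counts."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n            # observed agreement
    row = [sum(table[i]) for i in range(k)]                # rater A marginals
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]  # rater B marginals
    pe = sum(row[i] * col[i] for i in range(k)) / n**2     # chance agreement
    return (po - pe) / (1 - pe)

# Both tables have observed agreement po = 0.8, but different marginals.
balanced = [[40, 10], [10, 40]]     # symmetric marginals -> kappa = 0.6
unbalanced = [[70, 10], [10, 10]]   # skewed marginals    -> kappa = 0.375

print(round(cohens_kappa(balanced), 3))    # 0.6
print(round(cohens_kappa(unbalanced), 3))  # 0.375
```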
Chance-corrected measures of reliability and validity in K × K tables
AM Andrés, PF Marzo - Statistical methods in medical …, 2005 - journals.sagepub.com
When studying the degree of overall agreement between the nominal responses of two
raters, it is customary to use the coefficient kappa. A more detailed analysis requires the …
Equivalences of weighted kappas for multiple raters
MJ Warrens - Statistical Methodology, 2012 - Elsevier
Cohen's unweighted kappa and weighted kappa are popular descriptive statistics for
measuring agreement between two raters on a categorical scale. With m ≥ 3 raters, there …
Utility of weights for weighted kappa as a measure of interrater agreement on ordinal scale
M Heo - Journal of Modern Applied Statistical Methods, 2008 - jmasm.com
Kappa statistics, unweighted or weighted, are widely used for assessing interrater
agreement. The weights of the weighted kappa statistics in particular are defined in terms of …
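A common way to define the weights mentioned above is by powers of the category distance: |i − j| gives linear weights and (i − j)² gives quadratic weights. A minimal sketch under that assumption (the ratings table is invented for illustration):

```python
# Weighted kappa for ordinal categories using disagreement weights
# w_ij = |i - j| ** power: power=1 gives linear, power=2 quadratic weights.
# Weighted kappa = 1 - (observed weighted disagreement / expected weighted disagreement).

def weighted_kappa(table, power=1):
    """Weighted kappa for a square contingency table of counts."""
    k = len(table)
    n = sum(sum(row) for row in table)
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) ** power            # disagreement weight
            num += w * table[i][j] / n         # observed weighted disagreement
            den += w * row[i] * col[j] / n**2  # expected weighted disagreement
    return 1 - num / den

ratings = [[20, 5, 0], [5, 30, 5], [0, 5, 30]]
print(round(weighted_kappa(ratings, power=1), 3))  # linear:    0.759
print(round(weighted_kappa(ratings, power=2), 3))  # quadratic: 0.831
```

Quadratic weights penalize distant disagreements more heavily, which is why the quadratic value differs from the linear one on the same table.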
Testing the difference of correlated agreement coefficients for statistical significance
KL Gwet - Educational and Psychological Measurement, 2016 - journals.sagepub.com
This article addresses the problem of testing the difference between two correlated
agreement coefficients for statistical significance. A number of authors have proposed …
Testing the equality of two dependent kappa statistics
A Donner, MM Shoukri, N Klar, E Bartfay - Statistics in medicine, 2000 - Wiley Online Library
Procedures are developed and compared for testing the equality of two dependent kappa
statistics in the case of two raters and a dichotomous outcome variable. Such problems may …
Modification in inter-rater agreement statistics-a new approach
S Iftikhar - J Med Stat Inform, 2020 - pdfs.semanticscholar.org
Assessing agreement between examiners, measurements, and instruments is always of
interest to health-care providers, as the treatment of patients is highly dependent on the …
Exact one-sided confidence limits for Cohen's kappa as a measurement of agreement
G Shan, W Wang - Statistical methods in medical research, 2017 - journals.sagepub.com
Cohen's kappa coefficient, κ, is a statistical measure of inter-rater agreement or inter-
annotator agreement for qualitative items. In this paper, we focus on interval estimation of κ …
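For contrast with the exact limits studied in the paper, a crude large-sample Wald interval is often used, based on the simple standard-error approximation se ≈ sqrt(p_o(1 − p_o) / (n(1 − p_e)²)). A minimal sketch under that assumption (the table is invented for illustration; the paper's exact one-sided limits require dedicated computation):

```python
# Approximate Wald confidence interval for Cohen's kappa using a simple
# large-sample standard error. This is NOT the exact method of the paper;
# it is the textbook asymptotic approximation, shown for contrast.
import math

def kappa_wald_ci(table, z=1.96):
    """Return (kappa, (lower, upper)) for a square contingency table of counts."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    pe = sum(row[i] * col[i] for i in range(k)) / n**2
    kappa = (po - pe) / (1 - pe)
    se = math.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))  # simple approximation
    return kappa, (kappa - z * se, kappa + z * se)

kappa, (lo, hi) = kappa_wald_ci([[40, 10], [10, 40]])
print(round(kappa, 3), round(lo, 4), round(hi, 4))  # 0.6 0.4432 0.7568
```

For small samples or kappa near the boundary, this interval can misbehave, which is one motivation for exact limits.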
Robustness of κ‐type coefficients for clinical agreement
A Vanacore, MS Pellegrino - Statistics in Medicine, 2022 - Wiley Online Library
The degree of inter‐rater agreement is usually assessed through κ‐type coefficients and the
extent of agreement is then characterized by comparing the value of the adopted coefficient …
Five ways to look at Cohen's kappa
MJ Warrens - Journal of Psychology & Psychotherapy, 2015 - research.rug.nl
The kappa statistic is commonly used for quantifying inter-rater agreement on a nominal
scale. In this review article we discuss five interpretations of this popular coefficient. Kappa is …