Gwet's AC1 is not a substitute for Cohen's kappa – A comparison of basic properties
Gwet's AC1 has been proposed as an alternative to Cohen's kappa in evaluating the
agreement between two binary ratings. This approach is becoming increasingly popular …
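To make the comparison concrete, here is a minimal Python sketch of both coefficients for two raters and a binary outcome, using the standard definitions; the 2×2 counts and function names are illustrative, not taken from the paper.

```python
# Hypothetical 2x2 table of counts: a = both "yes", b/c = disagreements,
# d = both "no"; rows are rater 1, columns are rater 2.

def cohen_kappa(a, b, c, d):
    """Cohen's kappa: chance agreement from each rater's own marginals."""
    n = a + b + c + d
    po = (a + d) / n                      # observed agreement
    p1 = (a + b) / n                      # rater 1's "yes" rate
    p2 = (a + c) / n                      # rater 2's "yes" rate
    pe = p1 * p2 + (1 - p1) * (1 - p2)    # chance agreement under independence
    return (po - pe) / (1 - pe)

def gwet_ac1(a, b, c, d):
    """Gwet's AC1: chance agreement from the pooled prevalence instead."""
    n = a + b + c + d
    po = (a + d) / n
    pi = ((a + b) / n + (a + c) / n) / 2  # pooled probability of a "yes"
    pe = 2 * pi * (1 - pi)                # AC1 chance-agreement term
    return (po - pe) / (1 - pe)

print(cohen_kappa(45, 2, 3, 50), gwet_ac1(45, 2, 3, 50))  # balanced: ~0.90 vs ~0.90
print(cohen_kappa(90, 4, 5, 1), gwet_ac1(90, 4, 5, 1))    # skewed: ~0.13 vs ~0.90
```

With balanced prevalence the two coefficients barely differ; on a skewed table kappa collapses while AC1 stays near the raw agreement, which is the behavioral difference the comparison turns on.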
Statistical inference of Gwet's AC1 coefficient for multiple raters and binary outcomes
T Ohyama - Communications in Statistics-Theory and Methods, 2021 - Taylor & Francis
Cohen's kappa and intraclass kappa are widely used for assessing the degree of agreement
between two raters with binary outcomes. However, many authors have pointed out their …
Agreement between raters and groups of raters
S Vanbelle - 2009 - orbi.uliege.be
Agreement between raters on a categorical scale is not only a subject of scientific research
but also a problem frequently encountered in practice. Whenever a new scale is developed …
Conditional inequalities between Cohen's kappa and weighted kappas
MJ Warrens - Statistical Methodology, 2013 - Elsevier
Cohen's kappa and weighted kappa are two standard tools for describing the degree of
agreement between two observers on a categorical scale. For agreement tables with three …
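Since several of these results compare Cohen's kappa with its weighted variants, a minimal sketch of weighted kappa in the standard disagreement-weight formulation may help; the 0/1 scheme recovers unweighted kappa, so the three printed values give the kind of ordering such papers analyze. The table and scheme names are made up for illustration.

```python
import numpy as np

def weighted_kappa(table, scheme="linear"):
    """kappa_w = 1 - observed weighted disagreement / expected weighted disagreement."""
    p = np.asarray(table, dtype=float)
    p = p / p.sum()                                    # cell proportions
    k = p.shape[0]
    i, j = np.indices((k, k))
    if scheme == "nominal":
        w = (i != j).astype(float)                     # 0/1 weights: unweighted Cohen's kappa
    elif scheme == "linear":
        w = np.abs(i - j) / (k - 1)                    # penalty linear in category distance
    else:
        w = ((i - j) / (k - 1)) ** 2                   # "quadratic": penalty grows quadratically
    expected = np.outer(p.sum(axis=1), p.sum(axis=0))  # proportions under independence
    return 1 - (w * p).sum() / (w * expected).sum()

table = [[20, 5, 1],
         [4, 15, 6],
         [1, 5, 18]]
for scheme in ("nominal", "linear", "quadratic"):
    print(scheme, weighted_kappa(table, scheme))
```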
New interpretations of Cohen's kappa
MJ Warrens - Journal of Mathematics, 2014 - hindawi.com
Cohen's kappa is a widely used association coefficient for summarizing interrater agreement
on a nominal scale. Kappa reduces the ratings of the two observers to a single number. With …
Statistical inference of agreement coefficient between two raters with binary outcomes
T Ohyama - Communications in Statistics-Theory and Methods, 2020 - Taylor & Francis
Scott's pi and Cohen's kappa are widely used for assessing the degree of agreement
between two raters with binary outcomes. However, many authors have pointed out their …
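The difference between the two coefficients is small but structural: Scott's pi pools the raters' marginals before computing chance agreement, while Cohen's kappa keeps them separate. A small sketch with hypothetical counts:

```python
def scott_pi_and_kappa(a, b, c, d):
    """Both coefficients for a 2x2 table; they differ only in the chance term."""
    n = a + b + c + d
    po = (a + d) / n
    p1, p2 = (a + b) / n, (a + c) / n            # each rater's own "yes" rate
    pe_kappa = p1 * p2 + (1 - p1) * (1 - p2)     # Cohen: product of own marginals
    pbar = (p1 + p2) / 2                         # Scott: pooled marginal
    pe_pi = pbar ** 2 + (1 - pbar) ** 2
    return (po - pe_pi) / (1 - pe_pi), (po - pe_kappa) / (1 - pe_kappa)

print(scott_pi_and_kappa(40, 9, 6, 45))  # pi <= kappa holds for any 2x2 table
```

Because pooling the marginals can only increase the chance-agreement term, pi never exceeds kappa on the same table.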
Equivalences of weighted kappas for multiple raters
MJ Warrens - Statistical Methodology, 2012 - Elsevier
Cohen's unweighted kappa and weighted kappa are popular descriptive statistics for
measuring agreement between two raters on a categorical scale. With m ≥ 3 raters, there …
Weighted kappa is higher than Cohen's kappa for tridiagonal agreement tables
MJ Warrens - Statistical Methodology, 2011 - Elsevier
Cohen's kappa and weighted kappa are two popular descriptive statistics for measuring
agreement between two observers on a nominal scale. It has been frequently observed in …
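A quick numeric check of the claimed ordering, on a made-up tridiagonal table (nonzero cells only on the main diagonal and the two adjacent diagonals), using linear weights:

```python
import numpy as np

p = np.array([[20, 5, 0],
              [5, 20, 5],
              [0, 5, 20]], dtype=float)
p = p / p.sum()                              # tridiagonal agreement proportions
k = p.shape[0]
i, j = np.indices((k, k))
expected = np.outer(p.sum(axis=1), p.sum(axis=0))

w01 = (i != j).astype(float)                 # 0/1 weights -> Cohen's kappa
wlin = np.abs(i - j) / (k - 1)               # linear weights -> weighted kappa

kappa = 1 - (w01 * p).sum() / (w01 * expected).sum()
kappa_w = 1 - (wlin * p).sum() / (wlin * expected).sum()
print(kappa, kappa_w)                        # ~0.62 vs ~0.71: weighted is higher
```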
Cohen's kappa is a weighted average
MJ Warrens - Statistical Methodology, 2011 - Elsevier
The κ coefficient is a popular descriptive statistic for summarizing an agreement table. It is
sometimes desirable to combine some of the categories, for example, when categories are …
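One verifiable instance of such a decomposition (my reading; the paper's exact statement may phrase the weights differently): the full-table kappa equals a weighted average of the kappas of the 2×2 tables obtained by collapsing each category against all the others, weighted by their chance-disagreement denominators. A numeric check on a made-up 3×3 table:

```python
import numpy as np

p = np.array([[14, 4, 2],
              [3, 20, 5],
              [2, 6, 24]], dtype=float)
p = p / p.sum()
row, col = p.sum(axis=1), p.sum(axis=0)

kappa = (np.trace(p) - (row * col).sum()) / (1 - (row * col).sum())

kappas, weights = [], []
for i in range(p.shape[0]):
    # Collapse to "category i" vs "everything else" and compute a 2x2 kappa.
    po_i = p[i, i] + (1 - row[i] - col[i] + p[i, i])   # agree on i or on "rest"
    pe_i = row[i] * col[i] + (1 - row[i]) * (1 - col[i])
    kappas.append((po_i - pe_i) / (1 - pe_i))
    weights.append(1 - pe_i)                           # denominator as weight

print(kappa)                                   # full-table kappa
print(np.average(kappas, weights=weights))     # matches the weighted average
```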
Five ways to look at Cohen's kappa
MJ Warrens - Journal of Psychology & Psychotherapy, 2015 - research.rug.nl
The kappa statistic is commonly used for quantifying inter-rater agreement on a nominal
scale. In this review article we discuss five interpretations of this popular coefficient. Kappa is …