Agree or disagree? A demonstration of an alternative statistic to Cohen's Kappa for measuring the extent and reliability of agreement between observers

Q Xie - Proceedings of the Federal Committee on Statistical …, 2013 - nces.ed.gov
Agreement analysis is an important tool that has been widely used in medical, social,
biological, physical and behavioral sciences. Though there are many different ways of …

Estimation of symmetric disagreement using a uniform association model for ordinal agreement data

S Aktaş, T Saraçbaşı - AStA Advances in Statistical Analysis, 2009 - Springer
The Cohen kappa is probably the most widely used measure of agreement. Measuring the
degree of agreement or disagreement in square contingency tables by two raters is mostly of …
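
For reference, the chance-corrected index both of these entries start from is Cohen's kappa for two raters classifying the same items on a k-category scale. Writing p_ij for the proportion of items placed in row category i by the first rater and column category j by the second, with p_{i+} and p_{+i} the row and column margins, the standard definition (a recap of textbook material, not the symmetric-disagreement model of the Aktaş and Saraçbaşı paper) is

\kappa = \frac{p_o - p_e}{1 - p_e}, \qquad p_o = \sum_i p_{ii}, \qquad p_e = \sum_i p_{i+}\, p_{+i}.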

Kappa—A critical review

X Li - 2010 - diva-portal.org
The Kappa coefficient is widely used in assessing categorical agreement between two raters
or two methods. It can also be extended to more than two raters (methods). When using …

Hierarchical modeling of agreement

S Vanbelle, T Mutsvari, D Declerck… - Statistics in …, 2012 - Wiley Online Library
Kappa‐like agreement indexes are often used to assess the agreement among examiners
on a categorical scale. They have the particularity of correcting the level of agreement for the …

An estimating equations approach for modelling kappa

N Klar, SR Lipsitz, JG Ibrahim - Biometrical Journal: Journal of …, 2000 - Wiley Online Library
Agreement between raters for binary outcome data is typically assessed using the kappa
coefficient. There has been considerable recent work extending logistic regression to …

The effect of the raters' marginal distributions on their matched agreement: A rescaling framework for interpreting kappa

TM Karelitz, DV Budescu - Multivariate Behavioral Research, 2013 - Taylor & Francis
Cohen's κ measures the improvement in classification above chance level and it is the most
popular measure of interjudge agreement. Yet, there is considerable confusion about its …
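
A minimal numerical sketch of the effect this entry describes (the function and the two tables below are invented for illustration and are not taken from Karelitz and Budescu): two 2x2 tables with the same 90% observed agreement but different marginal distributions give very different kappa values.

```python
import numpy as np

def cohen_kappa(table):
    """Cohen's kappa from a square contingency table of counts for two raters."""
    p = np.asarray(table, dtype=float)
    p /= p.sum()
    p_o = np.trace(p)                    # observed proportion of agreement
    p_e = p.sum(axis=1) @ p.sum(axis=0)  # chance agreement from the margins
    return (p_o - p_e) / (1 - p_e)

balanced = [[45, 5], [5, 45]]  # symmetric margins, 90% agreement
skewed   = [[85, 5], [5,  5]]  # one dominant category, still 90% agreement
print(cohen_kappa(balanced))   # ~0.80
print(cohen_kappa(skewed))     # ~0.44
```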

Fleiss' kappa statistic without paradoxes

R Falotico, P Quatto - Quality & Quantity, 2015 - Springer
The Fleiss' kappa statistic is a well-known index for assessing the reliability of agreement
between raters. It is used both in the psychological and in the psychiatric field. Unfortunately …
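
A short sketch of the classical Fleiss' kappa computation and of the kind of paradox the entry alludes to; the data are invented for illustration, and this is the standard estimator rather than the corrected statistic Falotico and Quatto propose.

```python
import numpy as np

def fleiss_kappa(counts):
    """Classical Fleiss' kappa.

    counts: (N, k) array; counts[i, j] is the number of raters who put
    subject i into category j. Each row must sum to the same number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    N, k = counts.shape
    n = counts[0].sum()                    # raters per subject (assumed equal)
    p_j = counts.sum(axis=0) / (N * n)     # overall category proportions
    P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
    P_bar = P_i.mean()                     # mean observed pairwise agreement
    P_e = np.square(p_j).sum()             # chance agreement from the margins
    return (P_bar - P_e) / (1 - P_e)

# Paradox: three raters almost always agree on the first category, yet kappa
# comes out slightly negative because chance agreement is already about 0.94.
ratings = [[3, 0]] * 9 + [[2, 1]]
print(fleiss_kappa(ratings))  # ~ -0.03 despite ~93% raw pairwise agreement
```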

Generalized estimating equations with model selection for comparing dependent categorical agreement data

MY Tsai, JF Wang, JL Wu - Computational Statistics & Data Analysis, 2011 - Elsevier
Many studies in biomedical fields are carried out using diagnoses reported by different
raters to evaluate the agreement of multiple ratings. The most popular indices of agreement …

Disagreement on agreement: two alternative agreement coefficients

E Blood, KF Spratt - SAS Global Forum, 2007 - Citeseer
Everyone agrees there are problems with currently available agreement coefficients.
Cohen's weighted Kappa does not extend to multiple raters, and does not adjust for both …
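
For context on the first complaint: Cohen's weighted kappa is defined only for two raters rating the same items on a k-point scale. With disagreement weights w_ij (zero on the diagonal), for instance the common quadratic choice, it can be written as

\kappa_w = 1 - \frac{\sum_{i,j} w_{ij}\, p_{ij}}{\sum_{i,j} w_{ij}\, p_{i+}\, p_{+j}}, \qquad w_{ij} = \frac{(i-j)^2}{(k-1)^2},

and the lack of a direct multi-rater analogue of this two-rater form is the gap the entry points to. (This is the textbook definition, not one of the alternative coefficients Blood and Spratt propose.)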

Random marginal agreement coefficients: rethinking the adjustment for chance when measuring agreement

MP Fay - Biostatistics, 2005 - academic.oup.com
Agreement coefficients quantify how well a set of instruments agree in measuring some
response on a population of interest. Many standard agreement coefficients (e.g. kappa for …