Simultaneous estimation of intrarater and interrater agreement for multiple raters under order restrictions for a binary trait

H Lester Kirchner, JH Lemke - Statistics in Medicine, 2002 - Wiley Online Library
It is valuable in many studies to assess both intrarater and interrater agreement. Most
measures of intrarater agreement do not adjust for unequal estimates of prevalence …

A note on interrater agreement

KF Hirji, MH Rosove - Statistics in Medicine, 1990 - Wiley Online Library
We investigate the properties of a measure of interrater agreement originally proposed by
Rogot and Goldberg. Unlike commonly used measures, this measure not only adjusts for …

An empirical comparative assessment of inter-rater agreement of binary outcomes and multiple raters

M Konstantinidis, LW Le, X Gao - Symmetry, 2022 - mdpi.com
Background: Many methods under the umbrella of inter-rater agreement (IRA) have been
proposed to evaluate how well two or more medical experts agree on a set of outcomes. The …

[PDF] Concurrent Assessment of Interrater Agreement and Intrarater Reliability in the Case of Binary Data

MB Slater - 2006 - prism.ucalgary.ca
The aim of the thesis was to assess the performance of a probability model developed to
describe an agreement study in which two raters assess a sample of subjects on a binary …

Overall indices for assessing agreement among multiple raters

JH Jang, AK Manatunga, AT Taylor… - Statistics in …, 2018 - Wiley Online Library
The need to assess agreement exists in various clinical studies where quantifying inter‐rater
reliability is of great importance. Use of unscaled agreement indices, such as total deviation …

[PDF] Inter-rater reliability: dependency on trait prevalence and marginal homogeneity

K Gwet - Statistical Methods for Inter-Rater Reliability …, 2002 - Citeseer
Researchers have criticized chance-corrected agreement statistics, particularly the Kappa
statistic, as being very sensitive to raters' classification probabilities (marginal probabilities) …
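
For context on the criticism summarized above: with two raters and a binary trait, the standard textbook form of Cohen's kappa corrects observed agreement by a chance term built entirely from the raters' marginal classification probabilities. The sketch below gives that standard definition only; it is not a result taken from the cited chapter.

\kappa = \frac{p_o - p_e}{1 - p_e}, \qquad p_e = p_1 p_2 + (1 - p_1)(1 - p_2),

where p_o is the observed proportion of agreement and p_1, p_2 are the two raters' marginal probabilities of a positive classification. Because p_e depends only on these marginals, tables with identical observed agreement can yield very different kappa values as trait prevalence shifts, which is the sensitivity this entry criticizes.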

Inference procedures for assessing interobserver agreement among multiple raters

M Altaye, A Donner, N Klar - Biometrics, 2001 - Wiley Online Library
We propose a new procedure for constructing inferences about a measure of interobserver
agreement in studies involving a binary outcome and multiple raters. The proposed …

Assessing interrater agreement on binary measurements via intraclass odds ratio

I Locatelli, V Rousson - Biometrical Journal, 2016 - Wiley Online Library
Interrater agreement on binary measurements is usually assessed via Scott's π or Cohen's κ,
which are known to be difficult to interpret. One reason for this difficulty is that these …
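
As background for the coefficients named in this abstract: Scott's π has the same chance-corrected form as Cohen's κ but replaces the rater-specific chance term with one based on the pooled marginal. The sketch below uses the standard two-rater, binary-trait definitions; the paper's intraclass odds ratio is a different quantity and is not reproduced here.

\pi = \frac{p_o - p_e^{\pi}}{1 - p_e^{\pi}}, \qquad p_e^{\pi} = \bar{p}^{\,2} + (1 - \bar{p})^{2}, \qquad \bar{p} = \tfrac{1}{2}(p_1 + p_2),

where p_o is the observed proportion of agreement and p_1, p_2 are the two raters' marginal probabilities of a positive rating; Cohen's κ instead uses the rater-specific chance term p_e^{\kappa} = p_1 p_2 + (1 - p_1)(1 - p_2).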

[BOOK] Measures of interobserver agreement and reliability

MM Shoukri - 2003 - taylorfrancis.com
Agreement among at least two evaluators is an issue of prime importance to statisticians,
clinicians, epidemiologists, psychologists, and many other scientists. Measuring …

A hierarchical approach to inferences concerning interobserver agreement for multinomial data

A Donner, M Eliasziw - Statistics in Medicine, 1997 - Wiley Online Library
We consider inference methods for interobserver agreement studies characterized by two
raters and several outcome categories that one can naturally combine to address a series of …