A measure of agreement for interval or nominal multivariate observations

H Janson, U Olsson - Educational and Psychological …, 2001 - journals.sagepub.com
… overall chance-corrected interobserver agreement among the multivariate ratings of several
… of Cohen’s kappa coefficient is proposed. The generalized statistic accounts for agreement …

Interobserver and intraobserver variability in pH-impedance analysis between 10 experts and automated analysis

CM Loots, MP van Wijk, K Blondeau, K Dalby… - The Journal of …, 2012 - Elsevier
… Overall, 1242 liquid and mixed GER events were detected, 490 (42%) were scored by the
majority of observers, yielding moderate agreement (Cohen's kappa [κ] = 0.46). Intraclass co-…

Midwives' visual interpretation of intrapartum cardiotocographs: intra- and interobserver agreement

D Devane, J Lalor - Journal of advanced nursing, 2005 - Wiley Online Library
… Inter-rater agreement in interpretation was assessed by cross-tabulating the two sets of raw
data obtained at time 1 and time 2 and computing Cohen's Kappa (κ). Intra-rater agreement

Interobserver agreement in the interpretation of anal intraepithelial neoplasia

A Lytwyn, IE Salit, J Raboud, W Chapman, T Darragh… - Cancer, 2005 - Wiley Online Library
… We evaluated the interobserver agreement for cytology and biopsy specimens obtained
from an adequate number of consecutive patients, starting from the first patient enrolled in the …

… of colorectal liver metastases using MRI and CT: impact of observer experience on diagnostic performance and inter-observer reproducibility with histopathological …

MH Albrecht, JL Wichmann, C Müller… - European journal of …, 2014 - Elsevier
… used to analyse the inter-observer agreement between the four reviewers. Cohen's kappa
coefficients (κ) were calculated to assess the inter-observer agreement regarding the segment-…

Accuracy and interobserver agreement between MR-non-expert radiologists and MR-experts in reading MRI for suspected appendicitis

MMN Leeuwenburgh, BM Wiarda, S Jensch… - European journal of …, 2014 - Elsevier
Interobserver variability between the two MR-non-expert readings and the MR-expert reading
was expressed in percentage observed agreement and by calculating Cohen's kappa (κ), …

Interobserver agreement on dermoscopic features of pigmented basal cell carcinoma

K Peris, E Altobelli, A Ferrari, MC Fargnoli… - Dermatologic …, 2002 - Wiley Online Library
… Global agreement and pairwise interobserver agreement on the dermoscopic criteria
were estimated using Cohen's kappa statistic, with κ ≥ 0.40 indicating good agreement. …

Kappa testi [The kappa test]

S Kılıç - Journal of mood disorders, 2015 - academia.edu
… calculation, since κ takes into account the agreement occurring by chance. Cohen’s kappa
measures agreement between two raters only, but Fleiss' kappa is used when there are more …
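The chance correction described in this snippet can be sketched in a few lines: observed agreement is the fraction of cases where the two raters match, expected agreement is derived from each rater's marginal label frequencies, and kappa is the normalized difference. A minimal illustration (the `cohens_kappa` function and the example ratings are hypothetical, not from any of the studies above):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters (Cohen's kappa)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Two raters labelling 10 cases (illustrative data)
a = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg", "neg", "pos"]
b = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "neg", "pos", "pos"]
print(round(cohens_kappa(a, b), 2))  # 0.6: raw agreement 0.8, chance 0.5
```

Note that kappa is undefined when chance agreement equals 1 (both raters always assign the same single label); library implementations such as scikit-learn's `cohen_kappa_score` handle the multi-rater case differently, per the Fleiss extension mentioned in the snippet.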

Confidence of expert ultrasound operators in making a diagnosis of adnexal tumor: effect on diagnostic accuracy and interobserver agreement

J Yazbek, L Ameye, AC Testa, L Valentin… - … in Obstetrics and …, 2010 - Wiley Online Library
Interobserver agreement was evaluated with Cohen's kappa, kappa values of 0.81–1.0 …
with a high level of certainty (agreement rate 98–100%, Cohen's kappa 0.95–1.00), but was …

Intra and interobserver reliability and agreement of semiquantitative vertebral fracture assessment on chest computed tomography

CF Buckens, PA de Jong, C Mol, E Bakker… - PloS one, 2013 - journals.plos.org
… Intra- and interobserver agreement (absolute agreement or 95% Limits of Agreement) and
reliability (Cohen's kappa or intraclass correlation coefficient (ICC)) were calculated for the …