Reliability assessment for remote sensing data: beyond Cohen's kappa
GHG Kerr, C Fischer, R Reulke - 2015 IEEE International …, 2015 - ieeexplore.ieee.org
… As pointed out in [16], it is furthermore scaled from zero (pure chance agreement) to one (no
chance agreement), while Cohen's κ coefficient values depend on several parameters and …
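The dependence of Cohen's κ on factors beyond raw agreement, noted in the excerpt above, is easy to demonstrate: two rater pairs with identical observed agreement can produce very different κ values when the marginal label frequencies differ. A minimal sketch with illustrative counts (not data from the cited paper):

```python
def kappa_2x2(a, b, c, d):
    """Cohen's kappa from a 2x2 confusion table [[a, b], [c, d]] of counts."""
    n = a + b + c + d
    p_o = (a + d) / n  # observed agreement
    # Chance agreement from each rater's marginal frequencies.
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Balanced marginals: 90% observed agreement gives kappa = 0.8.
balanced = kappa_2x2(45, 5, 5, 45)
# Skewed marginals: the same 90% observed agreement gives kappa ~ 0.44.
skewed = kappa_2x2(85, 5, 5, 5)
```

This is the so-called kappa paradox: the chance-correction term p_e grows as one category dominates, depressing κ even when raters rarely disagree.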
Interobserver reliability in physical examination of the cervical spine in patients with headache
HA Van Suijlekom, HCW De Vet… - … : The Journal of …, 2000 - Wiley Online Library
… Reliability was assessed by Cohen's kappa. … the agreement between positive or negative
results. Furthermore, it is important to realize that kappa is a measure of agreement between …
Interobserver agreement between senior radiology resident, neuroradiology fellow, and experienced neuroradiologist in the rating of Alberta Stroke Program …
C Kobkitsuksakul, O Tritanon… - Diagnostic and …, 2018 - ncbi.nlm.nih.gov
… found that narrowing the window improved the interobserver agreement. This has important
… Interobserver agreement assessment with Cohen’s κ showed better correlation between the …
Comparison of the interobserver reproducibility with different histologic criteria used in celiac disease
GR Corazza, V Villanacci, C Zambelli, M Milione… - Clinical …, 2007 - Elsevier
… The aim of the study was to assess the interobserver agreement between different … with the
Cohen kappa statistic, which is commonly used to evaluate the degree of agreement beyond …
Validity inferences from interobserver agreement.
JS Uebersax - Psychological Bulletin, 1988 - psycnet.apa.org
… Methods for measuring rater agreement and making inferences about the accuracy of …
agreement on presence and absence of a trait separately, which the kappa coefficient does not do. …
A dedicated BI-RADS training programme: effect on the inter-observer variation among screening radiologists
JMH Timmers, HJ van Doorne-Nagtegaal… - European journal of …, 2012 - Elsevier
… Cohen's kappa (κ) was used to calculate the inter-observer agreement. The BI-RADS
2003 version was implemented in the screening programme as the BI-RADS 2008 version …
Interobserver agreement using histological scoring of the canine liver
JA Lidbury, A Rodrigues Hoffmann… - Journal of Veterinary …, 2017 - Wiley Online Library
… 1 study found a lack of interobserver agreement in the morphologic … To our knowledge,
interobserver agreement associated with … Cohen's kappa statistic (κ) frequently is used to estimate …
Interobserver agreement and accuracy of preoperative endoscopic ultrasound-guided biopsy for histological grading of pancreatic cancer
A Larghi, L Correale, R Ricci, I Abdulkader… - …, 2015 - thieme-connect.com
… was to assess the interobserver agreement and accuracy of … Agreement among pathologists
for grading of preoperative … This appears to be due to suboptimal interobserver agreement …
Inter-observer agreement in endoscopic scoring systems: preliminary report of an ongoing study from the Italian Group for Inflammatory Bowel Disease (IG-IBD)
M Daperno, M Comberlato, F Bossa, L Biancone… - Digestive and Liver …, 2014 - Elsevier
… Agreement measures: the Fleiss kappa values and intra-class correlation coefficients
with 95% confidence intervals are reported for the expert and non-expert subgroups for the …
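The IG-IBD entry above reports Fleiss' kappa, the standard generalisation of Cohen's kappa to more than two raters. As a sketch of how the statistic is computed, a minimal pure-Python implementation with hypothetical ratings (not the study's data):

```python
def fleiss_kappa(table):
    """Fleiss' kappa for a table where table[i][j] is the number of raters
    assigning item i to category j; every item has the same rater count."""
    n_items = len(table)
    n_raters = sum(table[0])
    # Per-item agreement: fraction of rater pairs agreeing on the item.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in table
    ) / n_items
    # Chance agreement from the overall category proportions.
    totals = [sum(row[j] for row in table) for j in range(len(table[0]))]
    p_e = sum((t / (n_items * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Three raters classify four items into two categories.
ratings = [[3, 0], [0, 3], [2, 1], [1, 2]]
```

With two raters and categorical ratings, Fleiss' kappa and Cohen's kappa generally differ slightly, because Fleiss' chance term pools both raters' marginals rather than keeping them separate.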
Rate of observation and inter-observer agreement for LI-RADS major features at CT and MRI in 184 pathology proven hepatocellular carcinomas
EC Ehman, SC Behr, SE Umetsu, N Fidelman… - Abdominal …, 2016 - Springer
… Inter-observer agreement of categorical data was evaluated using Cohen’s kappa statistic.
A Fisher’s exact test was used to compare the proportion of lesions demonstrating arterial …
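The last entry pairs Cohen's kappa with a Fisher's exact test for comparing proportions of lesions between groups. For a 2×2 table, the two-sided exact p-value can be computed directly from the hypergeometric distribution; a minimal sketch with illustrative counts (not the study's data):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    r1, r2 = a + b, c + d          # row totals
    c1, n = a + c, a + b + c + d   # first-column total, grand total

    def p_table(x):
        # Hypergeometric probability of a table with cell (0, 0) == x,
        # holding the margins fixed.
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # Sum over all tables no more probable than the observed one.
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Fisher's classic tea-tasting table gives p ~ 0.486.
p = fisher_exact_two_sided(3, 1, 1, 3)
```

For production use, `scipy.stats.fisher_exact` implements the same test and avoids the floating-point tolerance needed here when comparing table probabilities.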