Stats: What is a Kappa coefficient? (Cohen's Kappa)
The disagreeable behaviour of the kappa statistic - Flight - 2015 - Pharmaceutical Statistics - Wiley Online Library
Pitfalls in the use of kappa when interpreting agreement between multiple raters in reliability studies
(PDF) Kappa statistic to measure agreement beyond chance in free-response assessments
High Agreement and High Prevalence: The Paradox of Cohen's Kappa
The kappa statistic
A Typology of 22 Inter-coder Reliability Indices Adjusted for chance...
Measuring Inter-coder Agreement - ATLAS.ti
(PDF) Explaining the unsuitability of the kappa coefficient in the assessment and comparison of the accuracy of thematic maps obtained by image classification (2020) | Giles M. Foody
(PDF) Bias, Prevalence and Kappa
The comparison of kappa and PABAK with changes of the prevalence of the...
[PDF] More than Just the Kappa Coefficient: A Program to Fully Characterize Inter-Rater Reliability between Two Raters | Semantic Scholar
All about DAG_Stat
Agree or Disagree? A Demonstration of An Alternative Statistic to Cohen's Kappa for Measuring the Extent and Reliability of Agreement
(PDF) Sequentially Determined Measures of Interobserver Agreement (Kappa) in Clinical Trials May Vary Independent of Changes in Observer Performance
[PDF] The kappa statistic in reliability studies: use, interpretation, and sample size requirements. | Semantic Scholar
(PDF) A Formal Proof of a Paradox Associated with Cohen's Kappa
Why Cohen's Kappa should be avoided as performance measure in classification | PLOS ONE
Kappa statistic | CMAJ
Symmetry | Free Full-Text | An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters
2 Agreement Coefficients for Nominal Ratings: A Review
On population-based measures of agreement for binary classifications
MASTER'S THESIS
free-marginal multirater/multicategories agreement indexes and the K categories PABAK - Cross Validated
(PDF) Beyond kappa: A review of interrater agreement measures | Michelle Capozzoli - Academia.edu
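Several of the titles above concern the kappa prevalence paradox and the PABAK alternative. A minimal sketch (my own illustration, not code from any listed source) computing both statistics for a 2x2 agreement table shows the effect: with one highly prevalent category, raw agreement can be 90% while Cohen's kappa is near zero or negative, whereas PABAK tracks the raw agreement.

```python
# Cohen's kappa vs. PABAK on a 2x2 agreement table between two raters.
# Cell names use the usual layout: a = both raters say "yes",
# d = both say "no", b and c are the two kinds of disagreement.

def kappa_and_pabak(a: int, b: int, c: int, d: int) -> tuple[float, float]:
    n = a + b + c + d
    po = (a + d) / n                      # observed agreement
    p1_yes = (a + b) / n                  # rater 1 marginal for "yes"
    p2_yes = (a + c) / n                  # rater 2 marginal for "yes"
    # Chance agreement from the marginals (Cohen's expected agreement)
    pe = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)
    kappa = (po - pe) / (1 - pe)
    pabak = 2 * po - 1                    # prevalence/bias-adjusted kappa
    return kappa, pabak

# Skewed prevalence: 90% raw agreement, yet kappa comes out negative.
k, pb = kappa_and_pabak(a=90, b=5, c=5, d=0)
print(f"kappa={k:.3f}, PABAK={pb:.3f}")
```

With balanced marginals (e.g. a=40, b=10, c=10, d=40) the two statistics coincide; they diverge as prevalence becomes extreme, which is the behaviour the Flight (2015) and "Bias, Prevalence and Kappa" entries analyse.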