Beyond Kappa: Estimating Inter-Rater Agreement with Nominal Classifications
Source: Journal of Modern Applied Statistical Methods, 8(1), article 10 (2009)
Article / Letter to editor
Cohen’s Kappa and a number of related measures can all be criticized for how they define the correction for chance agreement. A measure is introduced that derives the corrected proportion of agreement directly from the data, thereby overcoming the objections raised against Kappa and its related measures.
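For context, the chance correction the abstract refers to is the one built into standard Cohen's Kappa, which compares observed agreement with the agreement expected from the raters' marginal distributions. The sketch below illustrates that standard correction only; it is not the measure proposed in the article, and the rater data are hypothetical.

```python
# Minimal sketch of standard Cohen's kappa for two raters assigning nominal
# categories. It shows the chance-agreement term p_e whose definition the
# abstract criticizes; it does NOT implement the article's proposed measure.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e) for two raters on the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)

    # Observed proportion of agreement p_o.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement p_e from each rater's marginal category proportions.
    marg_a = Counter(rater_a)
    marg_b = Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((marg_a[c] / n) * (marg_b[c] / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters classifying 10 items into categories 'x' and 'y'.
a = ['x', 'x', 'y', 'y', 'x', 'y', 'x', 'x', 'y', 'x']
b = ['x', 'y', 'y', 'y', 'x', 'y', 'x', 'x', 'x', 'x']
print(cohens_kappa(a, b))  # p_o = 0.8, p_e = 0.54, kappa ≈ 0.57
```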