Weighted Kappa for Interobserver Agreement and Missing Data

Matthijs J. Warrens*, Alexandra de Raadt, Roel Bosker, H.A.L. Kiers

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

The weighted kappa coefficient is commonly used for assessing agreement between two raters on an ordinal scale. This study is the first to assess the impact of missing data on the value of weighted kappa. We compared three methods for handling missing data in a simulation study: predictive mean matching, listwise deletion and a weighted version of Gwet’s kappa. We compared their performances under three missing data mechanisms, using agreement tables with various numbers of categories and different values of weighted kappa. Predictive mean matching outperformed the other two methods in most simulated cases in terms of root mean squared error and in all cases in terms of bias.
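As background for the abstract, the weighted kappa coefficient for a square agreement table can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, the `scheme` parameter, and the linear/quadratic weighting choices are assumptions for the example; the formula is the standard one, kappa_w = (p_o − p_e)/(1 − p_e), with weighted observed and chance agreement.

```python
import numpy as np

def weighted_kappa(table, scheme="linear"):
    """Weighted kappa for a k x k agreement table.

    Rows index rater A's categories, columns rater B's (ordinal scale).
    scheme: "linear" or "quadratic" agreement weights (illustrative names).
    """
    table = np.asarray(table, dtype=float)
    k = table.shape[0]
    p = table / table.sum()                      # joint proportions
    i, j = np.indices((k, k))
    d = np.abs(i - j) / (k - 1)                  # normalized category distance
    w = 1.0 - (d if scheme == "linear" else d ** 2)  # agreement weights
    po = np.sum(w * p)                           # weighted observed agreement
    # weighted chance agreement from the product of the marginals
    pe = np.sum(w * np.outer(p.sum(axis=1), p.sum(axis=0)))
    return (po - pe) / (1.0 - pe)
```

With perfect agreement (all mass on the diagonal) the coefficient equals 1; when the table equals the product of its marginals, it equals 0.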
Original language: English
Article number: 18
Number of pages: 19
Journal: Machine Learning & Knowledge Extraction
Volume: 7
Issue number: 1
DOIs
Publication status: Published - Mar-2025

