Abstract
Cohen's kappa is the most widely used descriptive measure of interrater agreement on a nominal scale. A measure that has repeatedly been proposed in the literature as an alternative to Cohen's kappa is Bennett, Alpert and Goldstein's S. The latter measure is equivalent to Janson and Vegelius' C and Brennan and Prediger's kappa(n). An agreement table can be collapsed into a table of smaller size by partitioning categories into subsets. The paper presents several results on how the overall S-value is related to the S-values of the collapsed tables.
It is shown that, if the categories are partitioned into subsets of the same size and we consider all collapsed tables of this partition type, then the overall S-value equals the average S-value of the collapsed tables. This result shows that there are ways of partitioning the categories that, on average, do not result in loss of information in terms of the S-value. In addition, it is proved that for all other partition types the overall S-value is strictly smaller than the average S-value of the collapsed tables. A consequence is that there is always at least one way to combine categories such that the S-value increases. The S-value increases if we combine categories on which there is considerable disagreement.
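The equal-size result can be checked numerically. Using the standard definition S = (k·P_o − 1)/(k − 1), where P_o is the observed proportion of agreement and k the number of categories, the sketch below (with a hypothetical 4×4 table of joint proportions, not data from the paper) verifies that the overall S-value equals the average S-value over the three collapsed 2×2 tables obtained from all partitions of the four categories into two pairs:

```python
from itertools import combinations

def s_value(table):
    """Bennett, Alpert and Goldstein's S for a k x k table of joint proportions."""
    k = len(table)
    p_o = sum(table[i][i] for i in range(k))  # observed proportion of agreement
    return (k * p_o - 1) / (k - 1)

def collapse(table, partition):
    """Collapse a table by merging the categories within each subset of the partition."""
    m = len(partition)
    out = [[0.0] * m for _ in range(m)]
    for a, sub_a in enumerate(partition):
        for b, sub_b in enumerate(partition):
            out[a][b] = sum(table[i][j] for i in sub_a for j in sub_b)
    return out

# Hypothetical 4x4 agreement table (joint proportions summing to 1)
p = [[0.20, 0.05, 0.01, 0.02],
     [0.03, 0.15, 0.04, 0.02],
     [0.02, 0.05, 0.18, 0.03],
     [0.01, 0.02, 0.05, 0.12]]

overall = s_value(p)

# All 3 partitions of {0, 1, 2, 3} into two subsets of size 2
partitions = []
for pair in combinations(range(4), 2):
    if 0 in pair:  # fix category 0 in the first subset to avoid double counting
        rest = tuple(i for i in range(4) if i not in pair)
        partitions.append((pair, rest))

collapsed = [s_value(collapse(p, part)) for part in partitions]
avg = sum(collapsed) / len(collapsed)
# For this equal-size partition type, avg equals overall (up to rounding)
```

The collapsed S-values differ from one another, but their mean reproduces the overall S-value exactly, as the paper proves; for unequal-size partition types the average would instead strictly exceed the overall value.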
| Original language | English |
|---|---|
| Pages (from-to) | 341-352 |
| Number of pages | 12 |
| Journal | Statistical Methodology |
| Volume | 9 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - May 2012 |
| Externally published | Yes |
Keywords
- Interrater reliability
- Nominal agreement
- Merging categories
- Bennett, Alpert and Goldstein's S
- Brennan and Prediger's kappa(n)
- Janson and Vegelius' C
- Janes' RE
- Cohen's kappa
- Cauchy-Schwarz inequality