Cohen's kappa
id:
cohen-s-kappa-213-2546266
title:
Cohen's kappa
text:
Cohen's kappa coefficient (κ) is a statistic used to measure inter-rater reliability for qualitative (categorical) items. It is generally considered a more robust measure than a simple percent-agreement calculation, because κ accounts for the possibility of the agreement occurring by chance. There is controversy surrounding Cohen's kappa due to the difficulty in interpreting indices of agreement; some researchers have suggested that it is conceptually simpler to evaluate disagreement between items.
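Since the text above describes κ as percent agreement corrected for chance, a worked example may help. The following is a minimal sketch, not part of the source article: the function name cohen_kappa and the sample ratings are illustrative. It computes κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each rater's marginal label frequencies.

from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled alike.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the two raters' marginal
    # proportions, summed over every category rater A used.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical data: two raters classifying 10 items as "yes"/"no".
a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(cohen_kappa(a, b))  # 0.583...: p_o = 0.8, p_e = 0.52

Here p_o = 0.8 (8 of 10 items match) exceeds p_e = 0.52, so κ ≈ 0.58, illustrating how chance correction lowers the raw 80% agreement figure.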
brand slug:
wiki
category slug:
encyclopedia
description:
Statistic measuring inter-rater agreement for categorical items
original url:
https://en.wikipedia.org/wiki/Cohen%27s_kappa
date created:
2005-04-06T19:39:21Z
date modified:
2024-09-12T08:53:27Z
main entity:
{"identifier":"Q1107106","url":"https://www.wikidata.org/entity/Q1107106"}
image:
fields total:
13
integrity:
15