Cohen's Kappa

What is it?
Cohen's kappa coefficient is a statistic that measures inter-rater agreement for qualitative (categorical) items. It is generally considered a more robust measure than a simple percent-agreement calculation, since κ takes into account the possibility of the agreement occurring by chance.

What is inter-rater agreement?
In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings.

Example:

| Actual Value | Predicted Value |
| ------------ | --------------- |
| 0            | 0               |
| 0            | 0               |
| 1            | 0               |
| 0            | 1               |
| 1            | 1               |
| 1            | 0               |
| 1            | 1               |
| 1            | 1               |
| 1            | 1               |
| 0            | 0               |

Cohen's Matrix

| Predicted \ Actual | 0 (Actual) | 1 (Actual) | Total |
| ------------------ | ---------- | ---------- | ----- |
| 0 (Predicted)      | 3          | 2          | 5     |
| 1 (Predicted)      | 1          | 4          | 5     |
| Total              | 4          | 6          | 10    |
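
To make the matrix concrete, here is a minimal Python sketch that rebuilds it from the example pairs above (the variable names `actual` and `predicted` are illustrative, not part of the original note):

```python
from collections import Counter

# Example pairs from the table above; variable names are illustrative.
actual    = [0, 0, 1, 0, 1, 1, 1, 1, 1, 0]
predicted = [0, 0, 0, 1, 1, 0, 1, 1, 1, 0]

# counts[(p, a)] = number of rows with Predicted == p and Actual == a,
# matching the (Predicted row, Actual column) layout of the matrix above.
counts = Counter(zip(predicted, actual))
for p in (0, 1):
    row = [counts[(p, a)] for a in (0, 1)]
    print(f"Predicted {p}: {row}  Total: {sum(row)}")
# Predicted 0: [3, 2]  Total: 5
# Predicted 1: [1, 4]  Total: 5
```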

```
P(a) [ Observed Proportionate Agreement ] = Sum of Cohen's Matrix diagonal / Total observations
                                          = (3 + 4) / 10 = 0.7

P(b) [ Probability of Random Agreement ]  = ( (Total of Actual 0's * Total of Predicted 0's) + (Total of Actual 1's * Total of Predicted 1's) ) / Total^2
                                          = ( (4 * 5) + (6 * 5) ) / (10 * 10) = 0.5
```

**Cohen's Kappa** = ( P(a) - P(b) ) / ( 1 - P(b) ) = ( 0.7 - 0.5 ) / ( 1 - 0.5 ) = 0.2 / 0.5 = 0.4
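
A short Python sketch of the same calculation, with an optional cross-check against scikit-learn's `cohen_kappa_score`; the helper name `compute_kappa` is only for illustration:

```python
# Sketch: compute P(a), P(b), and kappa for the example above.
# The helper name compute_kappa is illustrative, not from the original note.
def compute_kappa(actual, predicted):
    n = len(actual)
    # P(a): observed agreement = matching pairs / total observations
    p_a = sum(a == p for a, p in zip(actual, predicted)) / n
    # P(b): chance agreement, summed over both classes:
    # (count of Actual c * count of Predicted c) / n^2
    p_b = sum(actual.count(c) * predicted.count(c) / n**2 for c in (0, 1))
    return (p_a - p_b) / (1 - p_b)

actual    = [0, 0, 1, 0, 1, 1, 1, 1, 1, 0]
predicted = [0, 0, 0, 1, 1, 0, 1, 1, 1, 0]
print(compute_kappa(actual, predicted))  # ~0.4  (P(a) = 0.7, P(b) = 0.5)

# Optional cross-check (if scikit-learn is installed):
# from sklearn.metrics import cohen_kappa_score
# print(cohen_kappa_score(actual, predicted))  # ~0.4
```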

Kappa Coefficient Results
k >= 0.5 : Moderate Agreement
k >= 0.7 : Good Agreement
k >= 0.8 : Excellent Agreement
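
As a small usage sketch, the thresholds above can be applied directly; the function name `interpret_kappa` and the fallback label below the 0.5 cutoff are illustrative additions, not from the note:

```python
# Sketch: map a kappa value to the agreement labels listed above.
# interpret_kappa and the fallback label are illustrative additions.
def interpret_kappa(k):
    if k >= 0.8:
        return "Excellent Agreement"
    if k >= 0.7:
        return "Good Agreement"
    if k >= 0.5:
        return "Moderate Agreement"
    return "Below Moderate Agreement"  # under the note's lowest cutoff

print(interpret_kappa(0.4))  # Below Moderate Agreement
```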

Kappa Sensitivity : True Positives (true positive rate)
Kappa Specificity : True Negatives (true negative rate)
