Definitions and Interpretations
Kappa statistic
Kappa is calculated as the observed agreement beyond chance (e.g. 75% − 50% = 25%)
divided by the maximum agreement beyond chance (100% − 50% = 50%).
Example 1
Kappa = (75 − 50)/(100 − 50) = 0.5
Example 2: Kappa > 0
Prevalence of the class is 5%.
Classifier accuracy is 99%.
Kappa = (99 − 5)/(100 − 5) ≈ 0.99
Maximum kappa is 1.
Example 3: Kappa = 0
Prevalence of the class is 90%.
Classifier accuracy is 90%.
No accuracy beyond chance.
Kappa = (90 − 90)/(100 − 90) = 0
Example 4: Kappa < 0
Prevalence of the class is 90%.
Classifier accuracy is 10%.
Accuracy worse than chance.
Kappa = (10 − 90)/(100 − 90) = −8
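The simplified calculation used in these examples (accuracy beyond chance divided by maximum accuracy beyond chance, with everything in percent) can be sketched in Python; the function name is illustrative:

```python
def kappa(accuracy, chance):
    """Simplified kappa: (observed accuracy - chance accuracy)
    divided by (maximum accuracy - chance accuracy), in percent."""
    return (accuracy - chance) / (100.0 - chance)

# Example 1: 75% observed vs. 50% chance
print(kappa(75, 50))            # 0.5

# Example 2: 99% observed, 5% chance agreement
print(round(kappa(99, 5), 2))   # 0.99

# Example 3: accuracy equal to chance
print(kappa(90, 90))            # 0.0

# Example 4: accuracy worse than chance
print(kappa(10, 90))            # -8.0
```

Note that the sign of the result carries the interpretation: positive means better than chance, zero means no better than chance, negative means worse than chance.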
Error measures for numeric prediction (t_i = target value, o_i = predicted output, t̄ = mean of the targets):

Mean squared error:      (1/n) · Σ_{i=1..n} (t_i − o_i)²
Mean absolute error:     (1/n) · Σ_{i=1..n} |t_i − o_i|
Relative squared error:  Σ_{i=1..n} (t_i − o_i)² / Σ_{i=1..n} (t_i − t̄)²
Relative absolute error: Σ_{i=1..n} |t_i − o_i| / Σ_{i=1..n} |t_i − t̄|
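Reading the formulas above as the standard error measures for numeric prediction (with t_i the target values and o_i the predicted outputs), a minimal Python sketch; the function names are illustrative:

```python
def mse(t, o):
    # Mean squared error: (1/n) * sum of (t_i - o_i)^2
    return sum((ti - oi) ** 2 for ti, oi in zip(t, o)) / len(t)

def mae(t, o):
    # Mean absolute error: (1/n) * sum of |t_i - o_i|
    return sum(abs(ti - oi) for ti, oi in zip(t, o)) / len(t)

def relative_squared_error(t, o):
    # sum of (t_i - o_i)^2 divided by sum of (t_i - tbar)^2,
    # where tbar is the mean of the targets
    tbar = sum(t) / len(t)
    return (sum((ti - oi) ** 2 for ti, oi in zip(t, o))
            / sum((ti - tbar) ** 2 for ti in t))

def relative_absolute_error(t, o):
    # sum of |t_i - o_i| divided by sum of |t_i - tbar|
    tbar = sum(t) / len(t)
    return (sum(abs(ti - oi) for ti, oi in zip(t, o))
            / sum(abs(ti - tbar) for ti in t))

t = [1.0, 2.0, 3.0, 4.0]   # targets (illustrative data)
o = [1.5, 2.0, 2.5, 4.5]   # predictions
print(mse(t, o))           # 0.1875
print(mae(t, o))           # 0.375
```

The relative variants normalise by how well the trivial predictor (always guessing the target mean) would do, so values below 1 mean the model beats that baseline.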
Accuracy by class

                     Predicted class
True class           positive    negative
positive (#P)        #TP         #FN = #P − #TP
negative (#N)        #FP         #TN = #N − #FP
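The counts in the table above can be derived directly from paired true/predicted labels; a minimal sketch for the binary case, with illustrative names and data:

```python
def confusion_counts(true_labels, predicted):
    """Count #TP, #FN, #FP, #TN for a binary problem with
    'positive'/'negative' labels, using the identities
    #FN = #P - #TP and #TN = #N - #FP from the table."""
    tp = sum(1 for t, p in zip(true_labels, predicted)
             if t == "positive" and p == "positive")
    fp = sum(1 for t, p in zip(true_labels, predicted)
             if t == "negative" and p == "positive")
    n_pos = sum(1 for t in true_labels if t == "positive")   # #P
    n_neg = len(true_labels) - n_pos                         # #N
    fn = n_pos - tp   # #FN = #P - #TP
    tn = n_neg - fp   # #TN = #N - #FP
    return tp, fn, fp, tn

true_labels = ["positive", "positive", "negative", "negative", "negative"]
predicted   = ["positive", "negative", "negative", "negative", "positive"]
tp, fn, fp, tn = confusion_counts(true_labels, predicted)
print(tp, fn, fp, tn)   # 1 1 1 2
```

Per-class accuracy then follows from the same counts: #TP/#P for the positive class and #TN/#N for the negative class.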