
KAPPA ANALYSIS

Dr Rukman Mecca
Community Medicine Dept.
SETH GSMC & KEM Hospital Mumbai
First proposed by Cohen (1960)
to assess agreement between two or more equally
skilled observers.
It measures the agreement observed over and above the
agreement expected by chance.
The expected proportion of units where the observers give the same
results, if they are assumed to act independently, is denoted by Pe and
can be written:

Pe = [(a + b)(a + c) + (b + d)(c + d)] / n²

Let Po denote the observed proportion of units where
the two observers give identical classifications:

Po = (a + d) / n

Then the kappa coefficient is defined as:

K = (Po - Pe) / (1 - Pe)
K = (Proportion of observed agreement - Proportion of agreement by chance)
    / (1 - Proportion of agreement by chance)
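As a concrete illustration, the formula can be computed directly in code. The following Python sketch is illustrative only: the function name cohen_kappa and the a/b/c/d cell layout are assumptions for this example, not part of the original slides.

def cohen_kappa(a, b, c, d):
    # 2x2 agreement table; both observers rate each unit + or -:
    #   a = both say +           b = A says +, B says -
    #   c = A says -, B says +   d = both say -
    n = a + b + c + d
    po = (a + d) / n                                      # observed agreement
    pe = ((a + b) * (a + c) + (b + d) * (c + d)) / n**2   # chance agreement
    return (po - pe) / (1 - pe)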
Exercise 1

                 Observer B: +   Observer B: -   Total
Observer A: +         147               3          150
Observer A: -          10              62           72
Total                 157              65          222
Observed agreement: Po = (147 + 62) / 222 = 0.94
Expected agreement: Pe = [(157 * 150) + (65 * 72)] / 222²
= 0.57

Kappa: K = (0.94 - 0.57) / (1 - 0.57) = 0.86
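The same numbers can be checked with the cohen_kappa sketch given earlier (hypothetical code, not part of the original exercise):

k = cohen_kappa(147, 3, 10, 62)
print(round(k, 2))   # 0.86 (Po = 0.94, Pe = 0.57 before rounding)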


If the two experts agreed at the level of chance only,
Kappa would be 0; if the two experts agreed perfectly,
Kappa would be 1.
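Two made-up tables confirm these endpoints, again using the cohen_kappa sketch above (all counts are invented purely for illustration):

print(cohen_kappa(50, 50, 50, 50))   # 0.0: Po = Pe = 0.5, agreement purely by chance
print(cohen_kappa(80, 0, 0, 20))     # 1.0: no disagreement cells (b = c = 0)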
[Benchmark table: Kappa (k) vs. Strength of Agreement, after Sackett et al. (1991)]
Kappa may be misleading when:

1. The prevalence of the event of interest is low (a numeric sketch follows this list).

2. The diagnostic procedures are biased.
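Point 1 can be shown numerically with the cohen_kappa sketch from earlier. With hypothetical counts for a rare event, raw agreement is high, yet Kappa is low, because chance agreement on the common negative category is already large:

# Rare event: each observer labels only 6 of 100 units positive
k = cohen_kappa(1, 5, 5, 89)   # raw agreement = (1 + 89) / 100 = 0.90
print(round(k, 2))             # 0.11, despite 90% observed agreement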
