Kappa statistic
The kappa statistic is a measure of the degree of agreement, beyond that expected by chance alone, between observers or between repeated measurements of a categorical variable.
Kappa values in medical research publications
Medical practice involves observation. The reliability of observations cannot be taken for granted, since there is considerable subjective variability in observing even the simplest of events. The variability of observations in clinical medical practice is of two types:
1. Inter-observer variability: two or more observers exposed to the same event report different observations. The degree of difference indicates whether an observation can be reliably reproduced by a diverse group of observers (accuracy). An observation with high inter-observer variability is less reliable than one with low inter-observer variability.
2. Intra-observer variability: the same individual, repeatedly reporting the results of a fixed event (especially when blinded to his or her previous reports), may report different values. Again, reports of an event with high intra-observer variability are less reliable than those with low intra-observer variability (precision).
Ideally, observations must be both accurate and precise.
In clinical studies whose results depend on observation and the reporting of observed values (typically studies assessing imaging methods such as x-rays, ultrasound and CT scans), the value of the study can be better judged if these two measures of variability are evaluated and reported as a single quantity. The kappa statistic is calculated from the observed results and reported as a decimal; it theoretically ranges from −1 to 1, although in practice values usually fall between 0 and 1. A kappa near 0 indicates agreement no better than that expected by chance, while a value of 1 implies perfect agreement and is rarely achieved in practice. By convention, kappa values greater than 0.6 are generally interpreted as indicating substantial agreement.
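For two observers, the most widely used form of the statistic, Cohen's kappa, compares the observed proportion of agreement with the agreement expected by chance:

\[
  \kappa = \frac{p_o - p_e}{1 - p_e}
\]

where \(p_o\) is the observed proportion of agreement and \(p_e\) is the proportion of agreement expected by chance. As a hypothetical illustration (the numbers here are invented, not from any study): if two radiologists independently classify 100 scans as "abnormal" or "normal" and agree on 85 of them, then \(p_o = 0.85\); if the chance-expected agreement works out to \(p_e = 0.50\), then \(\kappa = (0.85 - 0.50)/(1 - 0.50) = 0.70\), which by the convention above would be read as substantial agreement.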
See also: Cohen's kappa, Fleiss' kappa