A' (pronounced "a-prime") is a statistic used in signal detection theory.
It is an estimate of the area under the receiver operating characteristic (ROC) curve and thus lies in the range 0 to 1.0, where 1.0 corresponds to perfect detection and A' = 0.5 indicates chance (50/50) performance at detecting a signal in noise. It is commonly used in psychophysics when exact calculation of the ROC curve is not possible.
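The article does not give a formula for A'. A commonly used non-parametric estimate from a single hit rate (H) and false-alarm rate (F) is the formula often attributed to Grier (1971); the Python sketch below is an illustrative implementation under that assumption, not something stated in the article itself.

```python
def a_prime(hit_rate: float, fa_rate: float) -> float:
    """Estimate A' from a hit rate (H) and a false-alarm rate (F).

    Uses the common non-parametric formula often attributed to Grier (1971).
    Illustrative sketch only; it is not taken from the article above.
    """
    h, f = hit_rate, fa_rate
    if h == f:
        # Equal hit and false-alarm rates correspond to chance performance.
        return 0.5
    if h > f:
        # Performance above chance.
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    # Performance below chance: mirror the formula around 0.5.
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))


# Example: H = 0.8, F = 0.2 gives A' = 0.875; H = F gives A' = 0.5 (chance).
print(a_prime(0.8, 0.2))
print(a_prime(0.5, 0.5))
```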
A' is related to the sensitivity index, d', which measures the separation between the means of the signal and noise distributions in units of the standard deviation of the noise distribution. When the ROC curve is assumed to have a slope not equal to 1.0, A' is generally a more accurate estimate of area than d' is of distance.[1]
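For comparison, d' is commonly estimated as the difference of the z-transformed hit and false-alarm rates, d' = z(H) − z(F), under the equal-variance Gaussian model. This standard estimator is well known but is not spelled out in the article; the sketch below assumes it.

```python
from statistics import NormalDist


def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Estimate d' = z(H) - z(F) under the equal-variance Gaussian model.

    Illustrative sketch; rates of exactly 0 or 1 must be corrected
    (e.g. with a 1/(2N) adjustment) before calling this function.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)


# Example: H = 0.8, F = 0.2 gives d' of roughly 1.68.
print(d_prime(0.8, 0.2))
```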
References
- ^ Donaldson, Wayne (1993). "Accuracy of d' and A' as estimates of sensitivity". Bulletin of the Psychonomic Society, 31(4), 271–274.
- Macmillan, N. A., & Creelman, C. D. (1991). Detection theory: A user's guide. Cambridge: Cambridge University Press. ISBN 0805842314.