Logit
- The logit function is an important part of logistic regression: for more information, please see that article.
The logit function is the inverse of the sigmoid, or "logistic", function used in mathematics, especially in statistics. The logit of a number p between 0 and 1 is given by the formula:
logit(p) = log(p/(1 − p)) = log(p) − log(1 − p).
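For concreteness, here is a minimal Python sketch of this definition, assuming the natural logarithm; the function names logit and logistic are illustrative rather than taken from any particular library:

```python
import math

def logit(p: float) -> float:
    """Log-odds of a probability p strictly between 0 and 1 (natural log)."""
    return math.log(p / (1.0 - p))

def logistic(x: float) -> float:
    """The logistic (sigmoid) function, which inverts logit."""
    return 1.0 / (1.0 + math.exp(-x))

# Round-trip check: logistic(logit(p)) recovers p.
p = 0.8
assert abs(logistic(logit(p)) - p) < 1e-12
```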
Logit is pronounced /ˈloʊdʒɪt/, with a long "o" and a soft "g".
The base of the logarithm function used is of little importance in the present article, as long as it is greater than 1, but the natural logarithm with base e is the one most often used.
If p is a probability, then p/(1 − p) is the corresponding odds, and the logit of the probability is the logarithm of the odds; similarly, the difference between the logits of two probabilities is the logarithm of the odds ratio (R), which provides a shorthand for combining odds ratios correctly by simple addition and subtraction:
log(R) = log((p1/(1 − p1)) / (p2/(1 − p2))) = logit(p1) − logit(p2).
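A short numerical check of this identity, again as a Python sketch with illustrative probabilities p1 and p2:

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1.0 - p))

p1, p2 = 0.9, 0.6
odds1, odds2 = p1 / (1.0 - p1), p2 / (1.0 - p2)

# The log of the odds ratio equals the difference of the logits.
log_odds_ratio = math.log(odds1 / odds2)
assert abs(log_odds_ratio - (logit(p1) - logit(p2))) < 1e-12
```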
History
Joseph Berkson introduced the logit model in 1944 and coined the term. The term was borrowed by analogy from the very similar probit model developed by Chester Ittner Bliss in 1934. In 1949, G. A. Barnard coined the commonly used term log-odds; the log-odds of an event is the logit of the probability of the event.
Uses and properties
- The logit in logistic regression is a special case of a link function in a generalized linear model: it is the canonical link function for the binomial distribution.
- The logit function is the negative of the derivative of the binary entropy function (see the sketch after this list).
- The logit is also central to the probabilistic Rasch model for measurement, which has applications in psychological and educational assessment, among other areas.
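The entropy property above can be checked numerically: taking the binary entropy function in nats, H(p) = −p ln p − (1 − p) ln(1 − p), its derivative is ln((1 − p)/p), which is −logit(p). A minimal Python sketch, assuming a central finite difference purely for illustration:

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1.0 - p))

def binary_entropy(p: float) -> float:
    """H(p) = -p ln p - (1 - p) ln(1 - p), measured in nats."""
    return -p * math.log(p) - (1.0 - p) * math.log(1.0 - p)

# Approximate H'(p) with a central finite difference; logit(p) should equal -H'(p).
p, h = 0.3, 1e-6
dH = (binary_entropy(p + h) - binary_entropy(p - h)) / (2.0 * h)
assert abs(logit(p) + dH) < 1e-6
```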
See also
- Daniel McFadden, a Nobel Prize winner recognized for the development of a particular logit model used in economics
- Logit analysis in marketing
- Perceptron
- Probit