Rasch model estimation

Estimation of a Rasch model is concerned with estimating the parameters of the model from matrices of response data. The most common approaches are types of maximum likelihood estimation, such as joint and conditional maximum likelihood estimation. Joint maximum likelihood (JML) equations are efficient, but inconsistent for a finite number of items, whereas conditional maximum likelihood (CML) equations give consistent and unbiased item estimates. Person estimates are generally thought to have bias associated with them, although weighted likelihood estimation methods for the estimation of person parameters reduce the bias.

Rasch model

The Rasch model for dichotomous data takes the form:


\Pr \{X_{ni}=1\}=\frac{\exp({\beta_n} - {\delta_i})}{1 + \exp({\beta_n} - {\delta_i})},

where  \beta_n is the ability of person  n and  \delta_i is the difficulty of item  i .
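
A minimal sketch of this response function in Python (the function name here is chosen purely for illustration):

    import math

    def rasch_probability(beta, delta):
        """Probability of a correct response under the dichotomous Rasch model."""
        return math.exp(beta - delta) / (1.0 + math.exp(beta - delta))

    # A person whose ability equals the item difficulty succeeds with probability 0.5.
    print(rasch_probability(0.0, 0.0))   # 0.5
    print(rasch_probability(1.0, 0.0))   # about 0.73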

Joint maximum likelihood

Let x_{ni} denote the observed response for person n on item i. On the assumption that responses are independent given the parameters, the probability of the observed data matrix is the product of the probabilities of the individual responses, and is given by the likelihood function


\Lambda = \frac{\prod_{n} \prod_{i} \exp(x_{ni}(\beta_n-\delta_i))}{\prod_{n} \prod_{i}(1+\exp(\beta_n-\delta_i))}.

The log-likelihood function is then


\log \Lambda = \sum_n^N \beta_n r_n - \sum_i^I \delta_i s_i - \sum_n^N \sum_i^I \log(1+\exp(\beta_n-\delta_i))

where r_n=\sum_i^I x_{ni} is the total raw score for person n, s_i=\sum_n^N x_{ni} is the total raw score for item i, N is the total number of persons and I is the total number of items.
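
Because the data enter the log-likelihood only through the raw scores, it is straightforward to evaluate. A minimal sketch, assuming X is a 0/1 response matrix stored as a list of lists (persons by items):

    import math

    def log_likelihood(X, beta, delta):
        """Evaluate log Lambda for a 0/1 response matrix X (persons x items)."""
        N, I = len(X), len(X[0])
        r = [sum(X[n]) for n in range(N)]                        # person raw scores
        s = [sum(X[n][i] for n in range(N)) for i in range(I)]   # item raw scores
        value = sum(beta[n] * r[n] for n in range(N))
        value -= sum(delta[i] * s[i] for i in range(I))
        value -= sum(math.log(1.0 + math.exp(beta[n] - delta[i]))
                     for n in range(N) for i in range(I))
        return value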

Solution equations are obtained by taking partial derivatives with respect to \delta_i and \beta_n and setting the result equal to 0. The JML solution equations are:


s_i = \sum_n^N p_{ni},\quad i=1,\dots,I

r_n = \sum_i^I p_{ni},\quad n=1,\dots,N

where p_{ni}=\exp(\beta_n-\delta_i)/(1+\exp(\beta_n-\delta_i)). A more accurate estimate of each \delta_i is obtained by multiplying the JML estimates by (I-1)/I, which corrects for the bias of the JML item estimates.
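
A minimal sketch of solving these equations by alternating Newton-Raphson steps, assuming a complete 0/1 response matrix in which no person or item has a zero or perfect raw score (such scores have no finite JML estimates); the function and variable names are illustrative:

    import math

    def jml_estimate(X, tol=0.001, max_iter=100):
        """Joint maximum likelihood estimation for the dichotomous Rasch model."""
        N, I = len(X), len(X[0])
        r = [sum(row) for row in X]                               # person raw scores
        s = [sum(X[n][i] for n in range(N)) for i in range(I)]    # item raw scores
        beta, delta = [0.0] * N, [0.0] * I

        def p(n, i):
            e = math.exp(beta[n] - delta[i])
            return e / (1.0 + e)

        for _ in range(max_iter):
            previous = delta[:]
            # Newton-Raphson step for each delta_i, holding the betas fixed
            for i in range(I):
                probs = [p(n, i) for n in range(N)]
                residual = sum(probs) - s[i]                  # solution equation residual
                information = sum(q * (1.0 - q) for q in probs)
                delta[i] += residual / information
            mean = sum(delta) / I
            delta = [d - mean for d in delta]                 # centre the item scale
            # Newton-Raphson step for each beta_n, holding the deltas fixed
            for n in range(N):
                probs = [p(n, i) for i in range(I)]
                residual = sum(probs) - r[n]
                information = sum(q * (1.0 - q) for q in probs)
                beta[n] -= residual / information
            if max(abs(d - o) for d, o in zip(delta, previous)) < tol:
                break

        delta = [d * (I - 1) / I for d in delta]              # (I-1)/I bias correction
        return beta, delta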

Conditional maximum likelihood

The conditional likelihood function is defined as


\Lambda = \prod_{n} \Pr\{(x_{ni})\mid r_n\} =\frac{\exp(-\sum_i s_i\delta_i)}{\prod_{n} \gamma_{r_n}}

in which


\gamma_r = \sum_{(x) \mid r}\exp(-\sum_i x_{ni}\delta_i)

is the elementary symmetric function of order r, that is, the sum over all combinations of r items of the products of the terms \exp(-\delta_i). For example, in the case of three items,

\gamma_2 = \exp(-\delta_1-\delta_2)+\exp(-\delta_1-\delta_3)+\exp(-\delta_2-\delta_3).
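
Enumerating every combination of items quickly becomes impractical as I grows, so the elementary symmetric functions are usually built up one item at a time with a summation recursion. A minimal sketch (illustrative names):

    import math

    def elementary_symmetric(deltas):
        """Return gamma_0, ..., gamma_I for the terms exp(-delta_i)."""
        eps = [math.exp(-d) for d in deltas]
        gamma = [1.0] + [0.0] * len(eps)          # gamma_0 = 1
        for e in eps:
            # Work from the highest order down so each item enters a
            # combination at most once.
            for order in range(len(gamma) - 1, 0, -1):
                gamma[order] += e * gamma[order - 1]
        return gamma

    # For three items this reproduces the gamma_2 expression above.
    d = [0.5, -0.2, 1.1]
    print(elementary_symmetric(d)[2])
    print(math.exp(-d[0] - d[1]) + math.exp(-d[0] - d[2]) + math.exp(-d[1] - d[2]))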

Estimation algorithms

Estimation of the parameters of Rasch models typically employs an iterative algorithm, such as some form of expectation-maximization. Algorithms for implementing maximum likelihood estimation commonly employ Newton-Raphson iterations to solve the solution equations obtained by setting the partial derivatives of the log-likelihood functions equal to 0. Convergence criteria are used to determine when the iterations cease. For example, the criterion might be that no item estimate changes by more than a certain value, such as 0.001, between one iteration and the next.
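
A stopping rule of this kind might be sketched as follows (the helper name is hypothetical):

    def has_converged(previous, current, tol=0.001):
        """Stop iterating once no item estimate has moved by more than tol."""
        return max(abs(new - old) for new, old in zip(current, previous)) < tol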
