Rasch model estimation

Various techniques are used to estimate the parameters of the Rasch model from matrices of response data. The most common approaches are methods of maximum likelihood estimation, such as joint and conditional maximum likelihood estimation. Joint maximum likelihood (JML) equations are efficient, but statistically inconsistent for a finite number of items, whereas conditional maximum likelihood (CML) equations give consistent item estimates. Person estimates generally carry some bias, although weighted likelihood estimation of the person parameters reduces this bias.

Rasch model

The Rasch model for dichotomous data takes the form:

\Pr \{X_{ni}=1\}=\frac{\exp({\beta_n} - {\delta_i})}{1 + \exp({\beta_n} - {\delta_i})},

where βn is the ability of person n and δi is the difficulty of item i.
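
As an illustration (not part of the original article), the model probability can be computed directly; the function name rasch_probability below is an arbitrary choice for this sketch:

import math

def rasch_probability(beta, delta):
    # Probability of a correct response under the dichotomous Rasch model.
    return math.exp(beta - delta) / (1.0 + math.exp(beta - delta))

# A person of ability 1.0 attempting an item of difficulty 0.0:
print(rasch_probability(1.0, 0.0))  # approximately 0.731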

Joint maximum likelihood

Let xni denote the observed response for person n on item i. The probability of the observed data matrix, which is the product of the probabilities of the individual responses, is given by the likelihood function

\Lambda = \frac{\prod_{n} \prod_{i} \exp(x_{ni}(\beta_n-\delta_i))}{\prod_{n} \prod_{i}(1+\exp(\beta_n-\delta_i))}.

The log-likelihood function is then

\log \Lambda = \sum_n^N \beta_n r_n - \sum_i^I \delta_i s_i - \sum_n^N \sum_i^I \log(1+\exp(\beta_n-\delta_i))

where r_n=\sum_i^I x_{ni} is the total raw score for person n, s_i=\sum_n^N x_{ni} is the total raw score for item i, N is the total number of persons and I is the total number of items.
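
As a sketch (not from the article), the log-likelihood can be evaluated from a response matrix using the raw scores; the function name log_likelihood and the NumPy dependency are illustrative assumptions:

import numpy as np

def log_likelihood(beta, delta, X):
    # Joint log-likelihood of a 0/1 response matrix X (persons x items),
    # written in terms of the raw scores r_n and s_i as in the formula above.
    beta = np.asarray(beta, dtype=float)    # person abilities, length N
    delta = np.asarray(delta, dtype=float)  # item difficulties, length I
    X = np.asarray(X, dtype=float)          # N x I matrix of responses
    r = X.sum(axis=1)                       # person raw scores r_n
    s = X.sum(axis=0)                       # item raw scores s_i
    return beta @ r - delta @ s - np.log1p(np.exp(beta[:, None] - delta[None, :])).sum()

print(log_likelihood([0.5, -0.2], [0.0, 0.3, -0.3], [[1, 0, 1], [1, 1, 0]]))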

Solution equations are obtained by taking the partial derivatives of the log-likelihood function with respect to δi and βn and setting each equal to 0. The JML solution equations are:

s_i = \sum_n^N p_{ni}   , i=1,...,I
r_n = \sum_i^I p_{ni}    , n=1,...,N

where pni = exp(βn − δi) / (1 + exp(βn − δi)). A more accurate estimate of each δi is obtained by multiplying the estimates by (I − 1) / I.
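
A minimal numerical sketch (not taken from the article) of solving these equations by alternating Newton-Raphson steps, with the (I − 1)/I correction applied at the end. The function name jml_estimate, the starting values and the centring of the item difficulties are illustrative choices, and items with extreme raw scores are assumed absent:

import numpy as np

def jml_estimate(X, tol=0.001, max_iter=1000):
    # Alternating Newton-Raphson solution of the JML equations.
    # Persons with extreme raw scores (0 or I) are dropped, since they
    # have no finite estimates; items are assumed to have non-extreme scores.
    X = np.asarray(X, dtype=float)
    N, I = X.shape
    r = X.sum(axis=1)                          # person raw scores r_n
    keep = (r > 0) & (r < I)
    X, r = X[keep], r[keep]
    s = X.sum(axis=0)                          # item raw scores s_i
    delta = np.zeros(I)
    beta = np.log(r / (I - r))                 # crude logit starting values
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(delta - beta[:, None]))   # p_ni
        step = (p.sum(axis=0) - s) / (p * (1 - p)).sum(axis=0)
        delta = delta + step
        delta -= delta.mean()                  # identification: centre the items
        p = 1.0 / (1.0 + np.exp(delta - beta[:, None]))
        beta -= (p.sum(axis=1) - r) / (p * (1 - p)).sum(axis=1)
        if np.max(np.abs(step)) < tol:         # convergence criterion on items
            break
    return delta * (I - 1) / I, beta           # (I - 1)/I correction for items

# Example with simulated data (200 persons, 5 items):
rng = np.random.default_rng(0)
true_beta = rng.normal(size=200)
true_delta = np.linspace(-1.0, 1.0, 5)
probs = 1.0 / (1.0 + np.exp(true_delta - true_beta[:, None]))
X = (rng.random(probs.shape) < probs).astype(int)
print(jml_estimate(X)[0])                      # item difficulty estimates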

Conditional maximum likelihood

The conditional likelihood function is defined as

\Lambda = \prod_{n} \Pr\{(x_{ni})\mid r_n\} = \frac{\exp(-\sum_i s_i\delta_i)}{\prod_{n} \gamma_{r_n}}

in which

\gamma_r = \sum_{(x) \mid r}\exp(-\sum_i x_{ni}\delta_i)

is the elementary symmetric function of order r, the sum over all response patterns in which exactly r of the items are answered correctly. For example, in the case of three items, γ2 = exp(−δ1 − δ2) + exp(−δ1 − δ3) + exp(−δ2 − δ3).
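
The γ_r can be computed with the usual one-item-at-a-time recursion for elementary symmetric functions; the sketch below (not part of the article, with an illustrative function name) reproduces the three-item example:

import numpy as np

def elementary_symmetric(delta):
    # gamma[r] = sum over all r-item combinations of exp(-(sum of their difficulties))
    eps = np.exp(-np.asarray(delta, dtype=float))
    gamma = np.zeros(len(eps) + 1)
    gamma[0] = 1.0
    for e in eps:                      # fold in one item at a time
        gamma[1:] = gamma[1:] + e * gamma[:-1]
    return gamma

delta = np.array([0.5, -0.2, 1.0])
print(elementary_symmetric(delta)[2])   # gamma_2 from the recursion
print(np.exp(-delta[0] - delta[1]) + np.exp(-delta[0] - delta[2]) + np.exp(-delta[1] - delta[2]))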

Estimation algorithms

Expectation-maximization algorithms are often used in the estimation of the parameters of Rasch models. Algorithms for maximum likelihood estimation commonly employ Newton-Raphson iterations to solve the solution equations obtained by setting the partial derivatives of the log-likelihood function equal to 0. Iterations cease when a convergence criterion is satisfied; for example, the criterion might be that every item estimate changes by less than a certain value, such as 0.001, between successive iterations.
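
As a small illustration of such an iteration (not from the article), a Newton-Raphson loop for a single person parameter with the item difficulties treated as known, stopping when the change falls below 0.001; the function name ml_person_estimate is hypothetical:

import numpy as np

def ml_person_estimate(r_n, delta, tol=0.001, max_iter=50):
    # Newton-Raphson solution of r_n = sum_i p_ni for one person's beta,
    # treating the item difficulties delta as known; r_n must be non-extreme.
    delta = np.asarray(delta, dtype=float)
    beta = 0.0
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(delta - beta))      # p_ni for this person
        step = (p.sum() - r_n) / (p * (1 - p)).sum()
        beta -= step
        if abs(step) < tol:                          # convergence criterion
            break
    return beta

print(ml_person_estimate(3, [-1.0, -0.5, 0.0, 0.5, 1.0]))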

See also

Expectation-maximization algorithm