Neutral vector

From Wikipedia, the free encyclopedia

In statistics, and specifically in the study of the Dirichlet distribution, a neutral vector of random variables is one that exhibits a particular type of statistical independence amongst its elements.[1]

Consider random variables X_1,\ldots,X_k such that \sum_{i=1}^k X_i=1; interpret the X_i as lengths whose sum is unity. In a variety of contexts, it is desirable to remove one proportion, say X_1, and consider the distribution of the remaining interval. The first element of X, namely X_1, is defined as neutral if X_1 is statistically independent of the vector (X_2/(1-X_1), X_3/(1-X_1), \ldots, X_k/(1-X_1)).

Variable X_2 is neutral if X_2/(1-X_1) is independent of the remaining interval: that is, if X_2/(1-X_1) is independent of (X_3/(1-X_1-X_2), X_4/(1-X_1-X_2), \ldots, X_k/(1-X_1-X_2)). Equivalently, X_2 is neutral when viewed as the first element of Y = (X_2/(1-X_1), X_3/(1-X_1), \ldots, X_k/(1-X_1)).

In general, variable X_j is neutral if (X_1,\ldots,X_{j-1}) is independent of (X_{j+1}/(1-X_1-\cdots-X_j), \ldots, X_k/(1-X_1-\cdots-X_j)).

A vector for which each element is neutral is completely neutral.
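As an illustration (not part of the source article), the rescaled remainder vector appearing in the definitions above can be computed directly; the function name below is hypothetical:

```python
# Illustration: computing the rescaled remainder vector
# (X_{j+1}/(1 - X_1 - ... - X_j), ..., X_k/(1 - X_1 - ... - X_j))
# that appears in the definition of neutrality.

def remainder_proportions(x, j):
    """Return the proportions remaining after removing x[0], ..., x[j-1],
    rescaled so that they again sum to one."""
    head = sum(x[:j])
    return [xi / (1.0 - head) for xi in x[j:]]

x = [0.2, 0.3, 0.1, 0.4]  # lengths summing to unity
r = remainder_proportions(x, 1)  # proportions of the interval left after X_1
print(r)
```

The returned proportions always sum to one, so the remainder is itself a vector of the same kind, which is what allows neutrality to be applied recursively to each element in turn.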

If X = (X_1, \ldots, X_K)\sim\operatorname{Dir}(\alpha) is drawn from a Dirichlet distribution, then X is completely neutral.
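A minimal empirical check of this property (a sketch, not a proof) is to sample from a Dirichlet distribution and verify that X_1 is uncorrelated with the rescaled remainder; the choice of alpha and seed below is arbitrary:

```python
# Empirical illustration of complete neutrality: for Dirichlet samples,
# X_1 is independent of X_2/(1 - X_1), so their sample correlation
# should be close to zero.
import numpy as np

rng = np.random.default_rng(0)
alpha = [2.0, 3.0, 4.0]
X = rng.dirichlet(alpha, size=100_000)  # each row sums to 1

x1 = X[:, 0]
rest = X[:, 1] / (1.0 - x1)  # first element of the rescaled remainder

corr = np.corrcoef(x1, rest)[0, 1]
print(corr)  # near zero, consistent with neutrality of X_1
```

Zero correlation is of course weaker than independence, but for Dirichlet samples the full independence holds, and this check makes the contrast with generic compositional data (where X_1 and the rescaled remainder are typically correlated) easy to see.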

See also

Generalized Dirichlet distribution

References

  1. ^ Connor, R. J. and Mosimann, J. E. (1969). "Concepts of independence for proportions with a generalization of the Dirichlet distribution". Journal of the American Statistical Association, 64, pp. 194–206.