Joint probability distribution
In the study of probability, given two random variables X and Y defined on the same probability space, the joint distribution of X and Y is the probability distribution of the pair (X, Y); it gives the probability that the events {X = x} and {Y = y} occur together, for every pair of values x and y. In the case of only two random variables this is called a bivariate distribution, but the concept generalizes to any number of random variables.
The discrete case
For discrete random variables, the joint probability mass function is

    \mathrm{P}(X = x\ \text{and}\ Y = y) = \mathrm{P}(Y = y \mid X = x)\,\mathrm{P}(X = x) = \mathrm{P}(X = x \mid Y = y)\,\mathrm{P}(Y = y).

Since these are probabilities, we have

    \sum_x \sum_y \mathrm{P}(X = x\ \text{and}\ Y = y) = 1.
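As a concrete illustration of these definitions (an added sketch, not part of the original article; the probability table is made up), the following Python computes the marginal distributions of a small joint probability mass function and checks that the joint probabilities sum to 1:

    # A small joint pmf stored as {(x, y): P(X = x and Y = y)};
    # the numbers are illustrative, not from the article.
    joint_pmf = {
        (0, 0): 0.10, (0, 1): 0.20,
        (1, 0): 0.30, (1, 1): 0.40,
    }

    # Marginals: sum the joint pmf over the other variable.
    p_x, p_y = {}, {}
    for (x, y), p in joint_pmf.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p

    # Since these are probabilities, the joint pmf must sum to 1.
    assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12

    print(p_x)  # P(X=0) = 0.3, P(X=1) = 0.7 (up to float rounding)
    print(p_y)  # P(Y=0) = 0.4, P(Y=1) = 0.6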
The continuous case
Similarly for continuous random variables, the joint probability density function can be written as f_{X,Y}(x, y), and this is

    f_{X,Y}(x, y) = f_{Y \mid X}(y \mid x)\, f_X(x) = f_{X \mid Y}(x \mid y)\, f_Y(y),

where f_{Y \mid X}(y \mid x) and f_{X \mid Y}(x \mid y) give the conditional distributions of Y given X = x and of X given Y = y respectively, and f_X(x) and f_Y(y) give the marginal distributions for X and Y respectively.

Again, since these are probability distributions, one has

    \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy\, dx = 1.
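As a numerical sanity check of this normalization condition (again an added sketch; the choice of a standard bivariate normal with correlation 0.5, the truncation to [-6, 6], and the grid size are all arbitrary for the example), the density below integrates to approximately 1 on a grid:

    import numpy as np

    rho = 0.5  # assumed correlation for this example

    def f_xy(x, y):
        # Density of a standard bivariate normal with correlation rho.
        norm = 1.0 / (2.0 * np.pi * np.sqrt(1.0 - rho ** 2))
        quad = (x ** 2 - 2.0 * rho * x * y + y ** 2) / (1.0 - rho ** 2)
        return norm * np.exp(-0.5 * quad)

    # Riemann sum over [-6, 6]^2, which holds essentially all the mass.
    xs = np.linspace(-6.0, 6.0, 601)
    dx = xs[1] - xs[0]
    X, Y = np.meshgrid(xs, xs, indexing="ij")
    total = f_xy(X, Y).sum() * dx * dx
    print(total)  # ~ 1.0, as a joint density must integrate to 1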
Joint distribution of independent variables
If P(X = x and Y = y) = P(X = x) P(Y = y) for discrete random variables for all x and y, or f_{X,Y}(x, y) = f_X(x) f_Y(y) for continuous random variables for all x and y, then X and Y are said to be independent.
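This condition can be checked mechanically by comparing the joint pmf with the product of its marginals; in this added sketch the table is constructed to factor exactly, so the check prints True:

    # Illustrative joint pmf chosen so that it factors as
    # P(X = x) P(Y = y); marginals are P(X) = (0.3, 0.7), P(Y) = (0.4, 0.6).
    joint_pmf = {
        (0, 0): 0.12, (0, 1): 0.18,
        (1, 0): 0.28, (1, 1): 0.42,
    }

    xs = {x for x, _ in joint_pmf}
    ys = {y for _, y in joint_pmf}
    p_x = {x: sum(joint_pmf[(x, y)] for y in ys) for x in xs}
    p_y = {y: sum(joint_pmf[(x, y)] for x in xs) for y in ys}

    # X and Y are independent iff the joint equals the product of
    # marginals for every pair (x, y).
    independent = all(
        abs(p - p_x[x] * p_y[y]) < 1e-12
        for (x, y), p in joint_pmf.items()
    )
    print(independent)  # True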
Multidimensional distributions
The joint distribution of two random variables can be extended to many random variables X_1, ..., X_n by adding them sequentially with the identity

    f_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = f_{X_n \mid X_1, \ldots, X_{n-1}}(x_n \mid x_1, \ldots, x_{n-1})\, f_{X_1, \ldots, X_{n-1}}(x_1, \ldots, x_{n-1}).
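To see this sequential factorization in action (an added sketch with made-up numbers), the following Python verifies the chain-rule identity on a three-variable discrete joint pmf, writing each conditional as a ratio of successive marginals:

    from itertools import product

    # An arbitrary normalized joint pmf over three binary variables.
    weights = dict(zip(product((0, 1), repeat=3), (1, 2, 3, 4, 5, 6, 7, 8)))
    total = sum(weights.values())
    joint = {k: w / total for k, w in weights.items()}

    def marginal(k):
        # Marginal pmf of (X_1, ..., X_k), summing out the rest.
        out = {}
        for full, p in joint.items():
            out[full[:k]] = out.get(full[:k], 0.0) + p
        return out

    m1, m2 = marginal(1), marginal(2)

    # Chain rule: P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x1, x2),
    # each conditional being a ratio of successive marginals.
    for (x1, x2, x3), p in joint.items():
        reconstructed = (m1[(x1,)]
                         * (m2[(x1, x2)] / m1[(x1,)])
                         * (joint[(x1, x2, x3)] / m2[(x1, x2)]))
        assert abs(p - reconstructed) < 1e-12
    print("chain-rule factorization verified")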