Craps principle

In probability theory, the craps principle is a theorem about event probabilities under repeated independent and identically distributed (iid) trials. Let E_1 and E_2 denote two mutually exclusive events that might occur on a given trial. Then for each trial, the conditional probability that E_1 occurs, given that E_1 or E_2 occurs, is

\operatorname{P}\left[E_1\mid E_1\cup E_2\right]=\frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}

The events E_1 and E_2 need not be collectively exhaustive.
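For illustration (the numbers here are arbitrary and not part of the theorem), suppose that on each trial \operatorname{P}[E_1]=0.1 and \operatorname{P}[E_2]=0.3, with the remaining probability 0.6 assigned to trials on which neither event occurs. Then

\operatorname{P}\left[E_1\mid E_1\cup E_2\right]=\frac{0.1}{0.1+0.3}=\frac{1}{4}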

Proof

Since E_1 and E_2 are mutually exclusive,

 \operatorname{P}[E_1\cup E_2]=\operatorname{P}[E_1]+\operatorname{P}[E_2]

Also, since E_1 is a subset of E_1\cup E_2,

 E_1\cap(E_1\cup E_2)=E_1

By the definition of conditional probability,

 \operatorname{P}[E_1\cap(E_1\cup E_2)]=\operatorname{P}\left[E_1\mid E_1\cup E_2\right]\operatorname{P}\left[E_1\cup E_2\right]

Combining these three yields the desired result.
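Explicitly,

 \operatorname{P}\left[E_1\mid E_1\cup E_2\right]=\frac{\operatorname{P}\left[E_1\cap(E_1\cup E_2)\right]}{\operatorname{P}\left[E_1\cup E_2\right]}=\frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}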

Application

If the trials are repetitions of a game between two players, and the events are

E_1:\mathrm{ player\ 1\ wins}
E_2:\mathrm{ player\ 2\ wins}

then the craps principle gives the respective conditional probabilities of each player winning a given repetition, given that someone wins (i.e., given that a draw does not occur). In fact, the result depends only on the relative sizes of the marginal probabilities of winning \operatorname{P}[E_1] and \operatorname{P}[E_2]; in particular, the probability of a draw is irrelevant.
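The irrelevance of draws can be checked numerically. The following is a minimal Monte Carlo sketch (the probabilities 0.1 and 0.3, the function name, and the trial count are illustrative assumptions, not part of the principle) that estimates the conditional probability by discarding trials on which neither player wins:

import random

def estimate_conditional(p1=0.1, p2=0.3, trials=1_000_000):
    # Simulate iid trials: player 1 wins with probability p1, player 2 with p2,
    # and the remaining probability corresponds to a draw.
    wins1 = decided = 0
    for _ in range(trials):
        u = random.random()
        if u < p1:               # E_1: player 1 wins this trial
            wins1 += 1
            decided += 1
        elif u < p1 + p2:        # E_2: player 2 wins this trial
            decided += 1
        # otherwise the trial is a draw and is discarded by the conditioning
    return wins1 / decided

# Expected to be close to 0.1 / (0.1 + 0.3) = 0.25, whatever the draw probability is.
print(estimate_conditional())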

Stopping

If the game is played repeatedly until someone wins, then this conditional probability is also the probability that a given player is the eventual winner: because the trials are independent and identically distributed, the deciding trial (the first one on which either player wins) has the conditional distribution given above.
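A similar sketch (again with the same illustrative probabilities, chosen only for this example) plays whole games until one of the players wins and confirms that the winner's distribution matches the per-trial conditional probability:

import random

def play_until_someone_wins(p1=0.1, p2=0.3):
    # Repeat iid trials until E_1 or E_2 occurs and return the winning player.
    while True:
        u = random.random()
        if u < p1:
            return 1             # E_1 occurred on the deciding trial
        if u < p1 + p2:
            return 2             # E_2 occurred on the deciding trial
        # otherwise a draw: play another trial

games = 1_000_000
wins1 = sum(play_until_someone_wins() == 1 for _ in range(games))
print(wins1 / games)             # expected to be close to 0.1 / (0.1 + 0.3) = 0.25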

Etymology

If the game being played is craps, then this principle can greatly simplify the computation of the probability of winning in a certain scenario. Specifically, if the first roll is a 4, 5, 6, 8, 9, or 10, then the dice are repeatedly re-rolled until one of two events occurs:

E_1:\textrm{ the\ original\ roll\ (called\ 'the\ point')\ is\ rolled\ (a\ win) }
E_2:\textrm{ a\ 7\ is\ rolled\ (a\ loss) }

Since E_1 and E_2 are mutually exclusive, the craps principle applies. For example, if the original roll was a 4, then \operatorname{P}[E_1]=3/36 (three of the 36 equally likely outcomes of two dice total 4) and \operatorname{P}[E_2]=6/36 (six of them total 7), so the probability of winning is

\frac{3/36}{3/36 + 6/36}=\frac{1}{3}
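The point-of-4 case can also be checked by direct simulation. The following minimal sketch (the function name and number of trials are arbitrary) re-rolls two dice until the point or a 7 appears:

import random

def point_phase_win_rate(point=4, trials=1_000_000):
    # Re-roll two dice until the point is rolled (a win) or a 7 is rolled (a loss).
    wins = 0
    for _ in range(trials):
        while True:
            total = random.randint(1, 6) + random.randint(1, 6)
            if total == point:
                wins += 1
                break
            if total == 7:
                break
    return wins / trials

print(point_phase_win_rate())    # expected to be close to 1/3 for a point of 4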

This avoids having to sum the infinite series corresponding to all the possible outcomes:

\sum_{i=0}^{\infty}\operatorname{P}[\textrm{first\ }i\textrm{\ rolls\ are\ ties,\ }(i+1)^\textrm{th}\textrm{\ roll\ is\ 'the\ point'}]

Mathematically, we can express the probability of rolling i ties followed by rolling the point:

\operatorname{P}[\textrm{first\ }i\textrm{\ rolls\ are\ ties,\ }(i+1)^\textrm{th}\textrm{\ roll\ is\ 'the\ point'}]
 = (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i\operatorname{P}[E_1]

The summation becomes an infinite geometric series:

\sum_{i=0}^{\infty} (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i\operatorname{P}[E_1]
= \operatorname{P}[E_1] \sum_{i=0}^{\infty} (1-\operatorname{P}[E_1]-\operatorname{P}[E_2])^i
 = \frac{\operatorname{P}[E_1]}{1-(1-\operatorname{P}[E_1]-\operatorname{P}[E_2])}
= \frac{\operatorname{P}[E_1]}{\operatorname{P}[E_1]+\operatorname{P}[E_2]}

which agrees with the earlier result.
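As a numerical sanity check (using the point-of-4 probabilities from above), the partial sums of this series can be computed directly:

p1, p2 = 3/36, 6/36              # P[E_1] and P[E_2] when the point is 4
q = 1 - p1 - p2                  # probability that a single roll is a tie

# Partial sums of the series sum_i q**i * p1 approach p1 / (p1 + p2).
for n in (1, 5, 10, 50):
    print(n, sum(q**i * p1 for i in range(n)))

print("limit:", p1 / (p1 + p2))  # = 1/3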
