Talk:Expected value

From Wikipedia, the free encyclopedia

This article is within the scope of WikiProject Statistics, which collaborates to improve Wikipedia's coverage of statistics. If you would like to participate, please visit the project page.

WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: B Class Top Priority  Field: Probability and statistics
One of the 500 most frequently viewed mathematics articles.


[edit] WAY TOO COMPLEX AN EXPLANATION

Deleted text:

Similarly, in computer science, the expected value of X is defined as

\operatorname{\mathbb{E}}[X] = \sum_i iP(X = i)

where X is an algorithm with different, weighted subroutines, and i is a particular algorithm path.

Populus 17:52, 16 Aug 2003 (UTC)

[edit] Removed

In general expectation is what is considered the most likely to happen. A less advantageous result gives rise to the emotion of disappointment. If something happens that is not at all expected it is a surprise. See also anticipation.--Jerryseinfeld 01:02, 1 Jan 2005 (UTC)

Fixed the redirect to point to expectation. Ben Cairns 03:46, 2 Feb 2005 (UTC).

[edit] American roulette wheel example

The Article reads .. An American roulette wheel has 38 equally likely outcomes. A winning bet placed on a single number pays 35-to-1 (this means that you are paid 35 times your bet and your bet is returned, so you get 36 times your bet). So considering all 38 possible outcomes, the expected value of the profit resulting from a $1 bet on a single number is:

E(X) = (−$1 × 37/38) + ($35 × 1/38) ≈ −$0.0526.

(Your net is −$1 when you lose and $35 when you win.) Therefore one expects, on average, to lose over five cents for every dollar bet, and the expected value of a one dollar bet is $0.9473. In gambling or betting, a game or situation in which the expected value of the profit for the player is zero (no net gain nor loss) is commonly called a "fair game."

Someone has changed the article. I suggest reverting the changes until the arguments come to a conclusion. Sanjiv swarup (talk) 02:28, 25 April 2008 (UTC)

Argument for change The roulette table example has a flaw - it compares apples to oranges. Either you use the "amount pushed across the table" in each term, or you need to use "net change".

In the "amount pushed across the table" case, I agree that the second term is $36 × 1/38. But in all cases, to play you have to put $1 down FIRST (it just so happens you get your own dollar back if you win). Using that logic, the formula should be (−$1 × 38/38) + (+$36 × 1/38), which computes out to about −$0.0526.

In the "net change" scenario, I agree that the first term is −$1 × 37/38. But since one dollar of the 36 you get back was yours at the beginning of the spin, you only net $35 on a win. Thus the formula would be (−$1 × 37/38) + (+$35 × 1/38), which still yields about −$0.0526. So, one should expect to lose over five cents for every dollar bet. - Anonymous
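The two accounting conventions described above can be checked side by side; a quick sketch (the dollar figures and probabilities are the ones from this thread):

```python
# Expected profit of a $1 single-number bet on an American roulette wheel,
# computed under both conventions discussed above.

# "Net change": lose $1 with probability 37/38, net +$35 with probability 1/38.
net_change = (-1) * (37 / 38) + 35 * (1 / 38)

# "Amount pushed across the table": your $1 always leaves first (38/38),
# and a win pushes $36 back to you (1/38).
pushed = (-1) * (38 / 38) + 36 * (1 / 38)

print(round(net_change, 4))  # -0.0526
print(round(pushed, 4))      # -0.0526
```

Both conventions give the same expected loss of about 5.26 cents per dollar bet, as the argument above concludes.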


You are absolutely right - will you fix it or should I? PAR 9 July 2005 04:17 (UTC)

Argument for no-change Skand swarup (talk) 13:13, 24 April 2008 (UTC) If one uses the "amount pushed across the table" in each term, the second term is $35 × 1/38, because one puts $1 down first and gets $36 back: "amount pushed across the table" = $(−1 + 36) = $35. In all cases, to play one has to put $1 down first, but the "amount pushed across the table" cannot be −$1 in all cases, since you win when you get your number. So the first term is −$1 × 37/38. Therefore, the terms remain the same in the "net change" and "amount pushed across the table" scenarios.

[edit] American roulette wheel example

The Article reads .. An American roulette wheel has 38 equally likely outcomes. A winning bet placed on a single number pays 35-to-1 (this means that you are paid 35 times your bet and your bet is returned, so you get 36 times your bet). So considering all 38 possible outcomes, the expected value of the profit resulting from a $1 bet on a single number is:

E(X) = (−$1 × 37/38) + ($35 × 1/38) ≈ −$0.0526.

(Your net is −$1 when you lose and $35 when you win.) Therefore one expects, on average, to lose over five cents for every dollar bet, and the expected value of a one dollar bet is $0.9473. In gambling or betting, a game or situation in which the expected value of the profit for the player is zero (no net gain nor loss) is commonly called a "fair game."


Isn't there an error in the example? Should it not be (−$1 × 37/38) + (+$36 × 1/38) rather than (−$1 × 37/38) + (+$35 × 1/38), since you get your $1 back? - Anonymous


Answer: No, there is no error in the example.

By Skand swarup (talk) 05:10, 18 April 2008 (UTC): The second term should not be (+$36 × 1/38) because the profit is $35, not $36. The $1 that one gets back is not a profit.

Opinion: Isn't there an error in the example? To my way of thinking, whenever you place a one-dollar bet, that dollar leaves your possession, so the odds of losing a dollar are 38/38. On the other hand, when you win, you receive $36, and the odds of that are 1/38. Unfree (talk) 21:34, 26 April 2008 (UTC)

Answer: No, there is no error in the example. Skand swarup (talk)

I agree that whenever one places a one-dollar bet, that dollar leaves your possession. But the odds of LOSING a dollar are 37/38, because on one outcome in 38 you get it back plus make a profit. If you say that the odds of losing a dollar are 38/38, you are saying that you are certain to lose one dollar whenever you place a bet. Why would anyone place a bet then? We are calculating expected profit, and the profit is $35 when you win, not $36.

[edit] changed

I changed the section "Nonnegative variables", which consisted of a representation formula for the expected values of nonnegative random variables, to a subsection called "representation", in which I write a formula for the general moment of a random variable. Moreover, I removed (in this subsection) the distinction between continuous and discrete random variables, since the formula holds without distinction. gala.martin

[edit] Roman vs. blackboard bold

Is there a reason the article switches from using \mathrm{E}X\, to \mathbb{E}X halfway through, or shall I change them all to roman E's for consistency? TheObtuseAngleOfDoom 21:19, 11 December 2005 (UTC)

No reason that I know of. PAR 22:30, 11 December 2005 (UTC)

No reason that I know. I would prefer to change all \mathrm{E}X\, to \mathbb{E}X as usual in math literature. gala.martin

Be bold! It's better to have a single form in the article. --Mgreenbe 22:47, 11 December 2005 (UTC)
I would like EX rather than \mathbb{E}X, as the former is more bearable inline, where it does not need to be a PNG picture but can be plain text. Wonder what others prefer. Oleg Alexandrov (talk) 00:32, 12 December 2005 (UTC)

I've gone ahead and been bold, as suggested, switching them all to roman. I also switched the \mathbb P's to roman as well. TheObtuseAngleOfDoom 14:53, 12 December 2005 (UTC)

Thanks! Oleg Alexandrov (talk) 17:48, 12 December 2005 (UTC)

[edit] "Fair game" - Expected Value = 0?

I've always thought that a "fair game" is one in which the expected value is 0 - over many repetitions the player stands to neither gain nor lose anything. I don't quite understand the "half stake" that's in there right now (end of intro paragraph). I'm planning on changing it back to the definition that I had put down, but maybe it's just something I don't know about expected values, so I wanted to make sure. -Tejastheory 17:58, 26 December 2005 (UTC)

Yes, the "stake" additions are wrong. The previous wording was not wonderful either, though. In a simple 2-person game, both players pay a "stake" into a pool, then one of them wins the pool. If the game is fair, then the expected income is half the total stake (not half of one player's stake as it says now). That "half" is only for 2-player games. The expected profit (income minus expenditure) is 0, which is true for fair games with any number of players. We should describe it in terms of profit, without using gambling words like "stake", as that is more general and easier to understand. --Zero 22:41, 26 December 2005 (UTC)
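The two-player case described above can be made concrete; a small sketch (the $10 stake is an illustrative number):

```python
# Two players each pay a stake s into a pool; a fair coin decides
# who takes the whole pool.
s = 10.0
pool = 2 * s

# Expected income for either player: win the pool with probability 1/2.
expected_income = 0.5 * pool + 0.5 * 0.0   # = half the TOTAL stake

# Expected profit: income minus the stake paid in. Zero means a fair game.
expected_profit = expected_income - s

print(expected_income)  # 10.0
print(expected_profit)  # 0.0
```

This matches the comment above: the expected income is half the total stake (a fact specific to two players), while the expected profit of zero is what generalizes to any number of players.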


[edit] Properties the expected value has not

We cite some properties the expected value 'has not' (functional non-invariance and non-multiplicativity). It is not meaningful to write the properties a mathematical object does not have; otherwise, we would have to write too many... I think it would be better to remove these properties, or to move them to the bottom of the list of properties. This concerns in particular the "functional non-invariance".

gala.martin

I changed the order of the list of properties, as explained above. Gala.martin 18:36, 28 January 2006 (UTC)

[edit] Question over notion of "fair game"

The article strikes me as okay - except for end of the 2nd paragraph that goes: In gambling or betting, a game or situation in which the expected value for the player is zero (no net gain nor loss) is called a "fair game."

While this seems to be convention (I have several references stating similar) the notion is false.

To determine if a game is fair, the probability of events and the odds offered are insufficient. You also need to consider the betting strategy used.

This can easily be seen in something I call the "fair bet paradox":

THE FAIR BET PARADOX: Imagine Alice and Bob start with $1000 each and both bet "heads" on an unbiased coin. A "fair bet", right? Well, let Alice bet just $1 per toss while Bob bets HALF HIS CURRENT FUNDS. Under this betting strategy, Alice's funds fluctuate around $1000 while Bob SWIFTLY GOES BROKE. True!

See the word doc "the fair bet paradox" downloadable from www.geocities.com/multigrals2000 for more info. The paradox is not a consequence of the gambler's fallacy or of Bob's initial lack of adequate funds: you can offer Bob unlimited credit at 0% interest and he'd still go broke. Likewise if you raise the probability of "heads" to a bit above 0.6 (at which Alice would become rich). You can also solve for the betting strategy of betting a random fraction of your funds, BUT THE GENERAL CASE SEEMS TO BE UNSOLVED (true?). Good luck to anyone who solves it.
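The Alice-and-Bob thought experiment above is easy to simulate; a sketch under the stated assumptions (even-money bets on a fair coin, Alice betting $1 flat, Bob betting half his current funds):

```python
import random

random.seed(1)

def play(tosses=100, trials=2000):
    """Median final funds for Alice ($1 flat bets) and Bob (half his
    current funds), both starting at $1000 and betting heads at even money."""
    alice_finals, bob_finals = [], []
    for _ in range(trials):
        alice, bob = 1000.0, 1000.0
        for _ in range(tosses):
            heads = random.random() < 0.5
            alice += 1 if heads else -1    # flat $1 bet
            bob *= 1.5 if heads else 0.5   # bets half his current funds
        alice_finals.append(alice)
        bob_finals.append(bob)
    alice_finals.sort()
    bob_finals.sort()
    return alice_finals[trials // 2], bob_finals[trials // 2]

alice_med, bob_med = play()
# Alice's median stays near $1000; Bob's median collapses toward $0,
# even though every individual bet has expected profit zero.
```

The reason is the expected log growth of Bob's bankroll per toss, 0.5·ln(1.5) + 0.5·ln(0.5) ≈ −0.144, which is negative, so his funds shrink multiplicatively in the typical case.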

I'd like to edit the main page but don't feel confident to do so. If someone else does so, could you please leave the first two paragraphs as they are and perhaps add an explanatory bit below it (in brackets?) as I'd like to do something on the subject latter and would like to refer to the present material. Okay?

Yours, Daryl Williams (www.geocities.com/multigrals2000)

It's not true that Bob will go broke in this game; there is a non-zero probability that Bob will break _the House_ and win whatever amount the House has (be it 1 million, 1 billion, or 1 googol). Unless Bob is betting against someone with infinite resources, like Cthulhu. Albmont 12:02, 9 March 2007 (UTC)

Or what if the House and Bob have unlimited credit?

Under reasonable circumstances, that "non-zero probability" is usually very small. With the above example (Alice and Bob start with $1000 each, Bob bets half his current funds, etc.) and 100 tosses, Bob needs 64 "heads" or more to break even or win (see note below).

The chance of this occurring is [100!/(64!36!) + 100!/(65!35!) + ... + 100!/(100!0!)] × (1/2)^100, which is approximately 0.0033, or about 1 chance in 300.

So, under Bob's betting strategy, can the game honestly be considered "fair"? Roughly 3 Bobs out of 1000 on average breaking even or winning? Fair? Not something I'd accept as a bet. And if you use 1000 tosses or more, the odds get even worse.

THE COMMONLY HELD NOTION THAT 'FAIR ODDS' MEANS 'FAIR GAME' IS FALSE. What's needed is "fair odds" PLUS "appropriate betting strategy" = "fair game".

Many gamblers (and investors?) are robbing themselves even more than necessary due to adopting betting strategies like Bob above. I'd like to do something to perhaps reduce this (if possible)

Okay?

Anyone want to help in this endeavour? If so, contact me.

note 1: the number of heads Bob needs is from

   (3/2)^H × (1/2)^(100−H) × 1000 ≥ 1000, or
   H ≥ 100 × ln(2)/ln(3) = 63.0929,
   where H is the minimum number of heads needed.
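The binomial tail in question can be computed exactly with Python's standard library; a quick check of the "64 heads or more out of 100" threshold:

```python
from math import comb

# P(at least 64 heads in 100 fair tosses):
# the chance that Bob breaks even or wins under his half-funds strategy.
p = sum(comb(100, k) for k in range(64, 101)) / 2**100
print(p)  # approximately 0.0033, i.e. roughly 1 chance in 300
```

The same sum also confirms the threshold itself: with 63 heads, (3/2)^63 × (1/2)^37 is just below 1, so 64 is the smallest winning count.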

Daryl Williams 03:24, 5 June 2007 (UTC)

So do you have an alternative suggestion on how to define a 'fair game'? iNic (talk) 00:38, 26 March 2008 (UTC)

[edit] E[f(X)]

Is there a general table giving E[f(X)] for varying functions f and with conditions for X? For example, I know that there is a closed formula for E[exp(X)] whenever X is normal, I could find it under log-normal distribution, but, if I didn't know it, I would be completely lost trying to find it. Albmont 10:02, 19 December 2006 (UTC)
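For the normal case mentioned above, the closed form is the log-normal mean: if X is normal with mean \mu and variance \sigma^2, then

\operatorname{E}[e^X] = e^{\mu + \sigma^2/2}

which is indeed where it appears in the log-normal distribution article.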


Do you mean the law of the unconscious statistician? It says that for a discrete random variable X, E[g(X)]=\sum_{x\in X}g(x)\cdot P(X=x)


For the continuous case, with f the probability density of X, E[g(X)]=\int_{x\in X}g(x)\cdot f(x)\,dx

Hyperbola 08:48, 12 September 2007 (UTC)
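The discrete formula above can be sketched in a few lines (a fair die and g(x) = x² are illustrative choices):

```python
# Law of the unconscious statistician, discrete case:
# E[g(X)] = sum over x of g(x) * P(X = x),
# with no need to work out the distribution of g(X) itself.

pmf = {x: 1 / 6 for x in range(1, 7)}   # fair six-sided die

def expect(g, pmf):
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, pmf)             # E[X] = 3.5
second_moment = expect(lambda x: x * x, pmf)  # E[X^2] = 91/6 ~ 15.167
```

The same pattern works for any g, which is the point of the question above: there is no table of special cases to memorize, only this one formula.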

[edit] Assumption missing?

In the "Iterated expectation for discrete random variables" section, isn't the assumption

\left( \sum\limits_y \operatorname{P}(Y=y|X=x) \right) = 1\,

true only if X and Y are defined over the same probability space?

It says so in the article about the law of total expectation.

Helder Ribeiro 20:11, 2 January 2007 (UTC)

No. If you sum up the total probability of every event, you have to get 1. Something is going to happen. If the sum of all the events is only 0.9, then there is a 10% chance that Y takes no value at all? That doesn't make sense. Therefore, no matter what spaces things are defined on, the total probability summed over the whole space is always 1. I think that other article is in error. There has to be some joint probability distribution, but X and Y can take values in completely different spaces. - grubber 16:27, 9 March 2007 (UTC)
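The point that the conditional probabilities sum to 1 for any joint distribution, even when X and Y take values in different spaces, can be checked directly; a sketch with a made-up joint table:

```python
# A joint distribution where X takes values in {'a', 'b'} and Y in {0, 1, 2}:
# different spaces, one joint distribution.
joint = {
    ('a', 0): 0.1, ('a', 1): 0.2, ('a', 2): 0.1,
    ('b', 0): 0.3, ('b', 1): 0.2, ('b', 2): 0.1,
}

def p_x(x):
    """Marginal probability P(X = x)."""
    return sum(p for (xv, _), p in joint.items() if xv == x)

# For each x, the conditional distribution P(Y=y | X=x) sums to 1.
for x in ('a', 'b'):
    total = sum(joint[(x, y)] / p_x(x) for y in (0, 1, 2))
    assert abs(total - 1.0) < 1e-12
```

As the reply above says, all that is required is a joint distribution; the value spaces of X and Y play no role in the normalization.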

[edit] How is the expected value different from the arithmetic mean?

This page says that another term for "expected value" is "mean". I find that dubious - especially since the page mean says that the expected value is sometimes called the "population mean" - which I also find to be dubious. If the expected value is the same thing as a mean, then the pages should be merged. If not, this page should explain the difference. Fresheneesz 01:09, 15 February 2007 (UTC)

Expected value and mean are not the same thing. Means are defined on sets, for example the "arithmetic mean of a set of numbers". Expected values are used in stochastic settings, where you take the expected value of a random variable; there is some underlying probability distribution involved in expected values. I'm not familiar with "population mean", but I have a hard time believing that it would be more than just a special case of expected value. You really do need a r.v. in order to take expected values. - grubber 16:20, 9 March 2007 (UTC)
If my understanding is correct, expected values are a mathematical concept - it's a function performed on a probability distribution. Means are a statistical concept - population mean being the mean of the entire population, and sample mean being an attempt to discover that population mean (or something approximating it). BC Graham (talk) 22:34, 25 March 2008 (UTC)

[edit] For two stochastic variables X and Y.

Discrete

E[XY]=\sum\limits_x \sum\limits_y xyf_{X,Y}(x,y)

Continuous

E[XY]=\int_{-\infty}^\infty \int_{-\infty}^\infty xyf_{X,Y}(x,y)\operatorname{d}y \operatorname{d}x

90.227.190.26 23:19, 5 April 2007 (UTC)
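The discrete formula posted above can be sketched with a small made-up joint pmf:

```python
# E[XY] = sum over (x, y) of x * y * f(x, y) for a discrete joint pmf f.
f = {
    (1, 1): 0.25, (1, 2): 0.25,
    (2, 1): 0.25, (2, 2): 0.25,   # X and Y independent, each uniform on {1, 2}
}
e_xy = sum(x * y * p for (x, y), p in f.items())
print(e_xy)  # 2.25
# With independence this factors: E[XY] = E[X] * E[Y] = 1.5 * 1.5 = 2.25.
```

The continuous case is the same computation with the double integral in place of the double sum.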

[edit] Dice a poor example

The average roll of a single die was the canonical example when I studied this too, but I feel it is a poor and misleading example. The expected value of 3.5 for a dice roll only makes sense if you are accumulating the sum of each dice roll--or if you are being paid $1 for a 1, $2 for a 2, etc. The pips on dice are usually interpreted as symbols, and adding the pips from consecutive rolls is rather unnatural and distracting. —Preceding unsigned comment added by Roberthoff82 (talkcontribs)

I disagree. I think it raises a very important characteristic of expected values at the very beginning of learning them, which is that they are not the intuitive "this value is what you would expect on any given event," but rather a statement about the distribution from which the value is being drawn. It is important to introduce that an expected value does not have to be a possible value. BC Graham (talk) 22:39, 25 March 2008 (UTC)
Clearly, "value" is being used in two senses: firstly, the "value," or number of dots, appearing on the die, and secondly, the "mean (arithmetic average) of the values" which are likely to appear in the long run. In one sense, the expected value is impossible, but in the other, not only is it quite possible, it's a mathematical certainty! Unfree (talk) 21:18, 26 April 2008 (UTC)
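The distinction drawn above (the long-run average versus the values the die can show) fits in a couple of lines:

```python
# Expected value of one roll of a fair die: a long-run average,
# not a value the die can actually show.
faces = range(1, 7)
ev = sum(faces) / 6   # each face has probability 1/6

print(ev)           # 3.5
print(ev in faces)  # False: the expected value is not a possible outcome
```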

[edit] Subscripting

I guess it might be useful to subscript the \operatorname{E} where necessary. Surely, this

\operatorname{E_Y} \left( \operatorname{E_X}(X|Y) \right)

is easier to read than this

\operatorname{E} \left( \operatorname{E}(X|Y) \right) —Preceding unsigned comment added by 137.132.250.8 (talk) 13:57, 26 February 2008 (UTC)
What does the vertical line (bar, pipe) stand for? Unfree (talk) 21:46, 26 April 2008 (UTC)
It is conditional expectation. --MarSch (talk) 11:38, 27 April 2008 (UTC)
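The iterated expectation that the subscripts are meant to clarify can be checked numerically; a sketch with a small made-up joint table:

```python
# E_Y[ E[X | Y] ] equals E[X]: the law of total expectation
# that the subscripted notation above is spelling out.
joint = {(1, 'u'): 0.2, (2, 'u'): 0.3, (1, 'v'): 0.4, (2, 'v'): 0.1}

def p_y(y):
    """Marginal probability P(Y = y)."""
    return sum(p for (_, yv), p in joint.items() if yv == y)

def e_x_given_y(y):
    """Conditional expectation E[X | Y = y]."""
    return sum(x * joint[(x, y)] / p_y(y) for x in (1, 2))

outer = sum(e_x_given_y(y) * p_y(y) for y in ('u', 'v'))   # E_Y[E[X|Y]]
direct = sum(x * p for (x, _), p in joint.items())          # E[X]
assert abs(outer - direct) < 1e-12   # both equal 1.4 here
```

Writing the outer expectation as E_Y and the inner one as E[X | Y] makes explicit which variable each averaging step runs over, which is exactly the readability argument made in this thread.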