Wikipedia:Reference desk/Archives/Mathematics/2007 August 22


Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


August 22

Probability Tree App

I'm looking for a simple app which will let you add branches with probabilities to a probability tree, and will give you the compound probabilities of each node (or each end node). I've searched a bit and couldn't find anything; I can't believe there isn't a simple Java app out there that will do this. I've tried doing it the old-fashioned way but it's fairly big. Capuchin 08:27, 22 August 2007 (UTC)

You could also try looking for a decision tree. Years ago I studied the lengthy manual of a DOS-based decision tree program that was probably shareware, and it would be nice to find a Windows version. 80.0.121.94 16:03, 27 August 2007 (UTC)
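For what it's worth, the bookkeeping is easy to script while you look for an app: the compound probability of any node is just the product of the branch probabilities on the path from the root down to it. A minimal sketch in Python (the tree representation and the example numbers are mine, purely for illustration):

```python
# Each node is a (label, probability, children) triple, where probability is
# the chance of taking the branch that leads into this node from its parent.
def leaf_probabilities(node, acc=1.0, path=()):
    """Return (path, compound probability) for every end node of the tree."""
    label, p, children = node
    acc *= p
    path = path + (label,)
    if not children:          # end node: the accumulated product is its compound probability
        return [(path, acc)]
    results = []
    for child in children:
        results.extend(leaf_probabilities(child, acc, path))
    return results

# Example: a fair coin flip followed by an unevenly weighted second stage.
tree = ("start", 1.0, [
    ("heads", 0.5, [("win", 0.3, []), ("lose", 0.7, [])]),
    ("tails", 0.5, [("win", 0.1, []), ("lose", 0.9, [])]),
])

for path, prob in leaf_probabilities(tree):
    print(" -> ".join(path), "=", prob)   # the four end-node probabilities sum to 1.0
```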

Euler line

Hello. How did Leonhard Euler prove the existence of the Euler line? Did he prove his theorem algebraically or by trial and error? Thanks in advance. --Mayfare 13:47, 22 August 2007 (UTC)

Well, he didn't prove it by trial and error, because that would be impossible; trial and error do not constitute mathematical proof. I don't know Euler's proof, and don't currently have access to the work cited in the article, but I would guess it was more geometric than algebraic in nature. Algebraist 16:14, 22 August 2007 (UTC)
Euler's proof, although essentially geometric, leans heavily on sustained application of elementary algebra: [1]. He first derives formulas for the locations of the various centres, and then uses these to show that they lie on one line.  --Lambiam 18:39, 22 August 2007 (UTC)
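Not Euler's own argument, but for contrast, the standard modern proof is a few lines of vector algebra. A sketch, with the circumcentre placed at the origin:

```latex
% Place the circumcentre O at the origin, so |A| = |B| = |C| = R.
% Define H := A + B + C.  Then the line AH is perpendicular to BC:
\[
(H - A)\cdot(B - C) \;=\; (B + C)\cdot(B - C) \;=\; |B|^2 - |C|^2 \;=\; 0,
\]
% and similarly for the other altitudes, so H is the orthocentre.
% Since the centroid is G = (A + B + C)/3, we get
\[
\vec{OH} \;=\; A + B + C \;=\; 3\,\vec{OG},
\]
% so O, G and H lie on one line, with G dividing OH in the ratio 1 : 2.
```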

Online Percent Gain Calculator

Where can I find an online percent gain calculator? I'm looking for one where I can enter a current value and then enter the desired percent gain and be given the resulting new total. --Gary123 15:31, 22 August 2007 (UTC)

No idea; have you tried Google?
Failing that, it isn't difficult to do with a calculator or a spreadsheet.
If p% is the desired percentage gain, C is the current value, and N is the new value, then
N = C \times \left(1 + \tfrac{p}{100}\right).
As an example, say you had £38,000 and wanted a gain of 8% over the next year; then
N = 38000 \times (1 + 0.08) = 41040.
Hope that helps. Richard B 15:57, 22 August 2007 (UTC)
See http://www.percent-change.com/.  --Lambiam 18:43, 22 August 2007 (UTC)
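If a website or spreadsheet is overkill, the formula above is one line in any programming language. A quick Python sketch (the function name is made up for illustration):

```python
def apply_gain(current, percent_gain):
    """Return the new total after applying a percentage gain to a current value."""
    return current * (1 + percent_gain / 100)

print(apply_gain(38000, 8))  # 41040.0, matching the worked example above
```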

Simple probability question

I haven't been to a math class in a long time and I am completely forgetting something I think I once knew.

What formula describes the probability of getting a value (at least once) out of "n" dice rolls?

Specifically, say something has a 1 in 20 chance of happening, and you roll that chance 4 times. The probability won't be 20% (I think it's quite a bit less), but I forget how to find what it would be.

I appreciate it.

The easy way to do this is to calculate the probability of it not happening and then subtract that from 1. So in your case, it has a 19/20 chance of not happening on one roll, so it's (19/20)^4 for four rolls, and 1 - (19/20)^4 \approx 18.5\%. Donald Hosek 18:48, 22 August 2007 (UTC)

Thanks!

(after edit conflict) I'm not sure I completely understand your question, but hopefully you'll find an answer in this reply. In the following, probabilities are numbers in the range from 0 to 1. A "1 in 20 chance" is a probability of 1/20 = 0.05. To get the probability of a bunch of independent things all happening, you just multiply their probabilities. For example, if on any given day there is a probability of 0.2 of rain, and 0.01 of an earthquake, then you have a probability of a rainy day with an earthquake of 0.2 × 0.01 = 0.002 (assuming these are independent events). The probability of something not happening is one minus the probability of it happening (where I assume it either does or does not rain). So the probability of no rain is then 0.8, that of no earthquake is 0.99, and that of no rain-cum-earthquake is 0.998.
So now, getting at least one six in 4 die rolls is the same as it not being the case that no six comes up in those 4 rolls. So the probability of getting at least one six in 4 die rolls is one minus the probability of not getting a six in 4 die rolls. The latter requires 4 things to happen (a non-six coming up on each roll), and the probability of all 4 happening is the product of their individual probabilities. The probability of not getting a six on a single die roll is 1 − 1/6 = 5/6. For this to happen 4 times in a row, we have a probability of 5/6 × 5/6 × 5/6 × 5/6 = 625/1296. So then to get at least one six, we have a chance of 1 − 625/1296 = 671/1296.
In general, if the probability of an event is p, and is independent on each try, then the probability of it happening at least once in n tries is
1 - (1 - p)^n.
 --Lambiam 19:07, 22 August 2007 (UTC)
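The closed-form answer above is also easy to sanity-check by simulation. A short Python sketch using the question's numbers (a 1-in-20 chance, tried 4 times):

```python
import random

def p_at_least_once(p, n):
    """Probability that an event of probability p occurs at least once in n independent tries."""
    return 1 - (1 - p) ** n

print(p_at_least_once(1 / 20, 4))  # 0.18549..., i.e. about 18.5%

# Monte Carlo check: simulate many groups of 4 rolls of a 20-sided die,
# counting the groups in which the 1-in-20 outcome came up at least once.
trials = 100_000
hits = sum(any(random.randrange(20) == 0 for _ in range(4)) for _ in range(trials))
print(hits / trials)  # should land near 0.185
```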

Confused About Probability

I do not understand how the concepts of probability relate to the following example. Let's pick a day -- say, tomorrow. Now, tomorrow, I will either live through the entire day or die at some point during the day. Obviously, we have no way of knowing which will happen -- but, it is clearly one of those two events (live / die). However, it does not seem intuitively correct to say that "I have a 50% chance of dying tomorrow." Or "I have only a 50% chance of surviving through tomorrow". So, how do the concepts of probability enter this equation? Or, in fact, do I actually have only a 50% chance of living tomorrow? Thanks. (Joseph A. Spadaro 19:54, 22 August 2007 (UTC))

What you have there is known as the principle of indifference. Obviously, it's not a very useful estimate of the probability, or we should be surprised to see anyone live a week. Better would be to consult tables of ages at death, and (perhaps with some interpolation) evaluate the corresponding probability distribution: the probability of dying on one's 19481st day of life, the 19482nd, etc. Obviously, over all ages the probabilities sum to 1; but individually (for any given day, like tomorrow) the chances are rather good that you'll live. Good luck with that! --Tardis 20:20, 22 August 2007 (UTC)
To clarify, the principle of indifference doesn't apply at all to the question of whether you'll live or die tomorrow; it only applies to possibilities that don't differ in any relevant physical way, like cards in a deck. Even if you do have reason to believe you have a 50% chance of dying tomorrow, you can't cite the principle of indifference as justification. (Well, unless you put a bullet in a revolver and pull the trigger three times, in which case you're really applying the principle of indifference to the six chambers, not to your life as such.) -- BenRG 23:02, 22 August 2007 (UTC)
The concept of probability is related to what you know about something. It is also empirically grounded in observations: how often something happened in how many similar cases. If you are studying the cat population in Rome, and of 2320 cats examined you find 311 tabbies, then the best estimate for the likelihood that your next cat will be a tabby is 311/2320. That is what you should use in considering whether or not to take a bet on the tabbihood of cat # N+1. However, if you then see on the evening news that a quick-spreading virus is killing the cat population, except that tabbies appear to be immune, you now know something that makes 311/2320 a poor estimate. Very likely you do not have enough information to produce a better estimate. Given this lack of information, the correct answer to the question "what is the probability?" is not "50%" but a simple "I don't know".
If your case is sufficiently similar to that of most people around you, having observed that on the average substantially more than 99 out of every 100 people survive each day until the next morrow, you can claim with some confidence that your chances of survival until tomorrow are better than 99%. But note that this is based on an assumed similarity. If you're currently engaged in an armed bank robbery, some of your comrades-in-crime have proven a bit too trigger happy, and the building has now been surrounded by an assault-rifle toting squad, there might be occasion to lower that estimate.  --Lambiam 20:43, 22 August 2007 (UTC)
For various other approaches, see probability interpretations. Algebraist 21:50, 22 August 2007 (UTC)

I do not know why you say the chance of dying tomorrow is 50%, because it does not make any sense. Say you know the life expectancy of a citizen of your country is 50 years; that means (taking this as the median) half of the citizens of your country will survive to the age of 50. 50 years is equal to 365 × 50 = 18250 days. Assume the probability of dying on any single day is X.

The probability of NOT dying on a single day is 1 − X = Y.
The probability of NOT dying for 18250 consecutive days is Y^18250 = 0.5 (the life expectancy of 50 years).
So Y = 0.5^(1/18250) = e^(ln(0.5)/18250) = 0.999962020054,
and X = 1 − Y = 0.000037979946.
So the probability of dying on a single day is 0.000037979946.

Which is, of course, nowhere near 50%.

Note: You cannot apply the "Principle of Indifference" when you have knowledge relevant to the problem at hand! Nor can you deliberately choose to be ignorant of the knowledge which you already have.

PS: If you do live in a country where you really have a 50% chance of dying tomorrow, then I would say the life expectancy of a citizen of your country must be about 1 day!

202.168.50.40 22:44, 22 August 2007 (UTC)
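For anyone who wants to reproduce the arithmetic above, here is the same constant-risk calculation as a short Python sketch (it assumes, as the post does, that every day carries the same probability of death and that half the population survives to exactly age 50):

```python
import math

days = 365 * 50                       # 18250 days to the median age at death
# Solve Y**days = 0.5 for the daily survival probability Y.
y = math.exp(math.log(0.5) / days)    # equivalently 0.5 ** (1 / days)
x = 1 - y                             # daily probability of dying

print(y)  # 0.999962020..., the chance of surviving any given day
print(x)  # 0.000037979..., nowhere near 50%
```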

Two outcomes by no means implies two equally likely outcomes. From a well-shuffled standard deck of 52 playing cards draw a single card. It is equally likely to be red or black. It is less likely to be a spade than not to be a spade, though each suit (clubs, hearts, spades, diamonds) is equally likely. It is even less likely to be an ace than not to be an ace, though each value (ace, two, three, …, ten, jack, queen, king) is equally likely. And there is only one chance in 52 that the chosen card is precisely the ace of spades, though each card is equally likely.
Thus we speak of a probability distribution. For example, a mortality table shows that the first year of life is more lethal than the rest of the first decade combined; in a large body of English text such as Wikipedia, words like "the", "of", "and" occur far more often than others, consistent with Zipf's law; and the number of particles emitted by uranium atoms spontaneously decaying in a given time interval follows a Poisson distribution. --KSmrqT 01:02, 23 August 2007 (UTC)
(after edit conflict) The actual death rate is heavily dependent on age: newborns are far more likely to die in their next year than, say, ten year old children. 90-year-olds are also much more likely to die than 20-year-olds in their next year.
Looking at a specific standard mortality table for life assurance (I think based on UK data): if you're a man and survive to age 20, then your expectation of life is around 62 years (i.e. expectation of age at death is 82). The probability of death before your 21st birthday is around 0.02%, or around 0.00005% in any one day. If there are 2 million 20-year-olds, then the expectation is that one will die per day, i.e. the chance of dying today for a 20-year-old is roughly 1 in 2 million.
If you survive to age 50, then your expectation of life is now 33 (i.e. expectation of age at death is 83). The probability of death before your 51st birthday is around 0.07%, or around 0.0002% in any one day. If there are 2 million 50 year olds, then the expectation is that four will die per day.
If you last until you're 80, then your expectation of life is now 8 (i.e. expectation of age at death is 88). The probability of death before your 81st birthday is around 6%, or around 0.02% in any one day. If there are 2 million 80 year olds, then the expectation is that over 300 will die per day.
If you make it to 99, then your expectation of life is now 3 (i.e. expectation of age at death is 102). The probability of death before your 100th birthday is around 29%, or around 0.1% in any one day. If there are 2 million 99 year olds, then the expectation is that nearly 2000 will die per day.
For the current oldest man in the world, at 112 years old, the standard tables give a life expectancy of around 1.6 years, and a probability of death over the next year of 46%, or approximately 0.2% in any one day. If there are 2 million 112-year-olds, then the expectation is that nearly 3400 will die per day.
As you can see, the death rate for people over 100 is thousands of times higher than for 20-year-olds, but even at 112 years old, you've still got a 99.8% chance of making it to tomorrow. If you're age 20, then you've got a 99.99995% chance of waking up tomorrow.
So it depends on age greatly, and also where you are from. The above figures are estimates for the UK - other countries would be different. Richard B 01:24, 23 August 2007 (UTC)
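To reproduce the per-day figures above from the annual ones, the usual conversion (assuming the risk is spread evenly over the year) is 1 − (1 − q)^(1/365). A short Python sketch using the 112-year-old's numbers from the table above:

```python
def daily_from_annual(q_annual):
    """Convert an annual probability of death to a daily one, assuming a constant hazard."""
    return 1 - (1 - q_annual) ** (1 / 365)

q_day = daily_from_annual(0.46)   # a 46% chance of death within the year
print(q_day)                      # about 0.0017, i.e. roughly 0.2% per day
print(1 - q_day)                  # about 99.8% chance of making it to tomorrow
print(2_000_000 * q_day)          # about 3400 deaths expected among 2 million people
```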

Infinity

When items are innumerable and have no end or finity, we use the concept of infinity. So, we can say, for example, that there are an infinite amount of integers ... or an infinite amount of real numbers ... or an infinite amount of multiples of 5. (etc.) So, let us take a look at the following three sub-divisions / categories: all of the integers; all of the even integers; all of the odd integers. There are an infinite amount of integers; there are an infinite amount of even integers; there are an infinite amount of odd integers. Is it correct or incorrect, then, to say: (A) There are an equal amount (i.e., an infinite amount) of total integers as there are of even integers? And, is it correct or incorrect to say: (B) There are twice as many total integers as there are even integers? Neither seems to make intuitive sense. Statement A seems wrong because there seems to be twice as many total integers as there are even integers ... and therefore, they can't have an equal amount. Statement B seems wrong because infinity is infinity is infinity ... no? That is, you can have "infinity" ... but there is no such thing as "twice infinity" or "half infinity" or "infinity times 2" ... right? Or wrong? I'm confused here. Thanks. (Joseph A. Spadaro 20:06, 22 August 2007 (UTC))

When you get into this kind of "infinity", you're talking about the cardinality of sets. In some intuitive sense there should be "more" integers than there are even integers, but in the sense of cardinality this is false. Everyday words like "larger" or "more" cease making sense when you talk about this kind of thing.
All three of the sets you described above have the same cardinality; hence, those sets are "the same size". Indeed, even the set of rational numbers is the same size as the sets you mentioned. However, the cardinality of the set of real numbers is "bigger". –King Bee (τγ) 20:35, 22 August 2007 (UTC)
For an entertaining account of such issues, see Hilbert's paradox of the Grand Hotel.  --Lambiam 20:48, 22 August 2007 (UTC)
You might also want to look at Galileo's paradox, which deals with the exact issue you raise in your question. AndrewWTaylor 07:43, 24 August 2007 (UTC)

As King says, "(A) There are an equal amount (i.e., an infinite amount) of total integers as there are of even integers" is correct. In fact, this is close to a definition of what it means for a set to be infinite: if a set has the same cardinality as a proper subset of itself, then it's an infinite set. But not all infinities are equal, as King also says. Speaking very loosely, if N is an infinite number then N + 2 and 2N are the same number as N, but 2^N is a larger number. --Anon, August 22, 2007, 21:29 (UTC).

Basically, (A) and (B) correspond to two different ways to measure amount, and there is no reason to expect that they would give the same results. (A) corresponds to cardinality, and (B) corresponds to asymptotic density. 00:00, 23 August 2007 (UTC)
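To make the two measurements concrete for the even integers, here is a sketch in LaTeX of both computations: (A) via an explicit bijection, and (B) via the density limit (restricting to the positive integers for the density, as is standard):

```latex
% (A) Equal cardinality: an explicit bijection pairs each integer with an even integer.
\[
f : \mathbb{Z} \to 2\mathbb{Z}, \qquad f(n) = 2n
\quad\text{is a bijection, so } |\mathbb{Z}| = |2\mathbb{Z}|.
\]
% (B) Asymptotic density: among 1, 2, ..., N, about half the numbers are even.
\[
d(2\mathbb{Z}) \;=\; \lim_{N \to \infty} \frac{\#\{\, 1 \le n \le N : n \text{ even} \,\}}{N}
\;=\; \lim_{N \to \infty} \frac{\lfloor N/2 \rfloor}{N} \;=\; \frac{1}{2}.
\]
```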
And the problem mentioned in the article about Galileo, the comparison of two line segments, is called measure. The thing is, there are an awful lot of things that are infinite in one way or another, and "one way or another" is the only way to describe them. They're infinite, so in that sense all problems that deal with the infinite are similar, but it's more complicated than that. To compare the infinite number of points on a line with the infinite number of points on another line, and derive a measure, is very different from either of the other two problems, and from many other problems that crop up whenever you deal with anything infinite. Distinctions have to be drawn, or it all becomes a muddle. In set theory, and therefore in much of counting, cardinality (or in some cases ordinality) is very descriptive, but there are other considerations. As the last guy said, there's asymptotic density (a very practical measurement in number theory and especially in computer science), or what about a different view of things entirely? Let's come up with a formal mathematical system that is motivated purely by the desire to use infinite numbers, and skip the applications. The surreal numbers would fit the bill with room to spare. And there's a similar system, a field with infinitesimals (the hyperreal numbers of non-standard analysis), that can be taken as the basis of calculus. It's not practical, since epsilon-delta proofs are logically equivalent and more intuitive, but it does still apply as a consistent description of the values involved in the calculations. Just like in the rest of math, you describe the problem you're working on and recognize that other problems might need different tools. So, see, all infinite numbers are in a sense the same, but in another sense they may be different. Black Carrot 09:43, 24 August 2007 (UTC)
An interesting aside: are positive and negative infinity different? Or, as in the case of the projective plane, is there perhaps a continuum of distinct infinities, each equal in size to the others but all different in sign, in other words in direction? Black Carrot 10:02, 24 August 2007 (UTC)
Topologically, you can extend the real numbers by defining either one or two infinities - see compactification (mathematics), real projective line and extended real number line. Gandalf61 13:22, 24 August 2007 (UTC)