Talk:Uncertainty
Definition
This is a good article. Don't ruin it with a narrow-minded definition of uncertainty. Heisenberg's principle has little or nothing to do with the general subject. When one can analyze errors one does not have real uncertainty! Dave 21:48, 22 December 2005 (UTC)
There is no definition of uncertainty on this page. What is it? Also, it should have some material about the electron microscope and Heisenberg, no?
This page needs work. It needs to start with a straightforward definition.
Removal of evaluation of uncertainty
I removed the following two paragraphs:
- Evaluating the degree of uncertainty, measuring it, and compensating for it is a basic process which is part of many activities and processes.
"Basic process" is my nemesis.
- As a situation is evaluated and more and more data is gathered and thought is taken on a matter, hypotheses are made, experiments are performed, and gradually the degree of uncertainty is reduced as conclusions are drawn, but never wholly eliminated, except in certain special circumstances as where knowledge is defined by definition.
This is one possible view, but I'm sure some epistemologists would disagree. I don't think this is a universally acceptable statement.
--Ryguasu 01:04 Dec 3, 2002 (UTC)
- After reading The Black Swan, I have to agree with you Ryguasu, although that doesn't mean that hypotheses shouldn't be made. --AB (talk) 15:58, 22 April 2008 (UTC)
Uncertainty in physics
How is uncertainty defined/used in Physics? What about error analysis? --Aug 21, 2005
Decomposition of uncertainty by Peter Fisher
Here's an analysis of the concept I picked up (hopefully correctly) from a lecture by Peter Fisher - I'm sure it's in one of his published papers. I'd like to store it somewhere and this seemed like a good place, but maybe it should be discussed a bit first.
Uncertainty can be divided into well-defined and poorly defined parts. The well-defined part can be analysed with probability theory. The poorly defined part can be divided into vagueness and ambiguity. Vagueness can be analysed with fuzzy set theory. Ambiguity can be divided into non-specificity and discord. Non-specificity can be analysed with possibility theory, and discord can be analysed with ontologies.
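To keep the structure straight, here is a rough sketch of that decomposition as a nested mapping in Python. The labels are just my rendering of the lecture notes, not anything taken from Fisher's papers:

    # Sketch of the decomposition above: each kind of uncertainty maps either
    # to sub-kinds or to the analysis framework suggested for it.
    uncertainty_taxonomy = {
        "well defined": "probability theory",
        "poorly defined": {
            "vagueness": "fuzzy set theory",
            "ambiguity": {
                "non-specificity": "possibility theory",
                "discord": "ontologies",
            },
        },
    }

    def leaves(tree, path=()):
        """Yield (path to a kind of uncertainty, suggested analysis method)."""
        if isinstance(tree, str):
            yield path, tree
        else:
            for kind, subtree in tree.items():
                yield from leaves(subtree, path + (kind,))

    for path, method in leaves(uncertainty_taxonomy):
        print(" / ".join(path), "->", method)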
--Ari Jolma Feb 1, 2007
Let's not reinvent the issue
Douglas Hubbard just wrote the book "How to Measure Anything: Finding the Value of Intangibles in Business", published by John Wiley & Sons. He has a pretty complete discussion of every definition of uncertainty and even addresses the quote by Knight. Uncertainty is simply the lack of complete certainty. As he states it, uncertainty is a state of knowledge where a 100% correct assessment of a state (past, present or future) is not possible. This is consistent with all the main schools of thought about uncertainty, including probability theory, the decision sciences, statistics, information theory and physics.
Contrary to the claims of the person at the top of this discussion page, it does not confuse the topic to show that these uses of the term are quite consistent. Nor, as the same person states, does it mean that if we just removed error, we would have no uncertainty. In modern physics uncertainty is not just a function of error in observation but an intrinsic and irreducible aspect of reality - at least at the subatomic level.
Hubbard also clarifies what Knight found unclear: the difference between uncertainty and risk. In the pragmatic sense of people who do risk analysis all the time, risk is simply uncertainty about a state of affairs where at least one possible state involves a loss (something undesirable). To take it further, Hubbard defines the measurement of uncertainty as the assignment of probabilities to the set of possible states such that p(union of all states) = 1, p(intersection of mutually exclusive states) = 0, and p(each state) > 0. Finally, the measurement of risk is simply a measurement of uncertainty together with an assignment of a loss (or a pdf of losses) for each of the uncertain states. Many states may have a loss of 0, but if no state has a loss greater than 0 then there is, by definition, no risk. Knight's quote notwithstanding, this actually is the practical use of the term in the risk-measurement industries and professions.
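For concreteness, here is a minimal Python sketch of those definitions as I read them. The state names, probabilities and losses are made up for illustration, and the expected-loss figure is just one common way to summarize the assignment of losses to states:

    # Hypothetical, mutually exclusive states, each with a probability and a loss.
    states = {
        "contract renewed":      (0.70, 0.0),
        "contract renegotiated": (0.25, 50_000.0),
        "contract cancelled":    (0.05, 400_000.0),
    }

    def is_valid_assignment(states, tol=1e-9):
        """Measurement of uncertainty: every state gets p > 0 and the p's sum to 1."""
        probs = [p for p, _ in states.values()]
        return all(p > 0 for p in probs) and abs(sum(probs) - 1.0) < tol

    def has_risk(states):
        """By the definition above, risk exists only if at least one state carries a loss."""
        return any(loss > 0 for _, loss in states.values())

    def expected_loss(states):
        """One way to summarize risk: the probability-weighted loss over all states."""
        return sum(p * loss for p, loss in states.values())

    assert is_valid_assignment(states)
    print("risk present:", has_risk(states))                   # risk present: True
    print("expected loss:", round(expected_loss(states), 2))   # expected loss: 32500.0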
If you are wondering why a book with that title would talk so much about uncertainty, it is because Hubbard defines measurement as observations that reduce uncertainty, expressed as a quantity. This is also the de facto use of the term "measurement" in the empirical sciences, even if they don't explicitly define it that way. All empirical measurements reported in peer-reviewed journals must state the error of the measurement, and that error is presumed not to be zero. Even though such measurements don't (usually) eliminate uncertainty, they still count as measurements because there is less uncertainty than there was before (this is also consistent with information theory).
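A tiny illustration of that last point (the numbers are hypothetical, not from the book): an observation that narrows a probability distribution lowers its Shannon entropy, even though the entropy rarely drops all the way to zero.

    import math

    def entropy_bits(probs):
        """Shannon entropy, in bits, of a discrete probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    prior     = [0.25, 0.25, 0.25, 0.25]   # before observing: four equally likely states
    posterior = [0.70, 0.20, 0.05, 0.05]   # after a hypothetical observation

    print(entropy_bits(prior))      # 2.0 bits of uncertainty
    print(entropy_bits(posterior))  # about 1.26 bits (reduced, but not eliminated)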
BillGosset 00:23, 20 June 2007 (UTC)
Analyzing uncertainty does not mean you have no "real" uncertainty
Dave said "When one can analyze errors one does not have real uncertainty!". This is not correct. First, simply analyzing errors is not the same as removing errors. Much of statistics is about quantifying the error you have remaining after a set of observations, even if you don't reduce it further, much less eliminate it. When you compute your 90% confidence interval based on a sample, you have literally analyzed uncertainty but you have not removed it (a quick sketch of that point follows below). Furthermore, "real" uncertainty still exists even when you analyze error. It is considered classical, not modern, physics to presume that the only uncertainty is from error in observation. In quantum mechanics, uncertainty is a basic property of particles and is not just a function of observation error. BillGosset 00:29, 20 June 2007 (UTC)
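To illustrate the confidence-interval point, here is a minimal sketch with made-up sample values, assuming a normal approximation (z = 1.645) for the 90% interval:

    import statistics

    # Hypothetical sample; a normal approximation is assumed for the interval.
    sample = [9.8, 10.4, 10.1, 9.6, 10.9, 10.2, 9.9, 10.5]
    mean = statistics.mean(sample)
    sem = statistics.stdev(sample) / len(sample) ** 0.5   # standard error of the mean

    z_90 = 1.645   # two-sided 90% critical value of the standard normal
    low, high = mean - z_90 * sem, mean + z_90 * sem

    # The interval quantifies the remaining uncertainty about the mean; it does not remove it.
    print(f"90% confidence interval for the mean: ({low:.2f}, {high:.2f})")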
Remove David Wilkinson reference
Wilkinson is not a notable source on this topic. His book was not based on any prior research in the mathematical understanding of uncertainty and risk. He was trained in education management for law enforcement and later became a management consultant. Wilkinson is simply confused about, or probably unaware of, the mathematically well-defined meanings of these terms. He makes no attempt to reconcile his contradictory definitions with the established scientific and mathematical use of the terms. He is a layman on this topic and should not be cited. Hubbardaie 11:40, 21 June 2007 (UTC)