User talk:Ancheta Wis/t
Stephen Toulmin (1967) "The Astrophysics of Berossos the Chaldean", Isis, Vol. 58, No. 1 (Spring, 1967), pp. 65-76
George E. P. Box (1978) Statistics for Experimenters ISBN 0-471-09315-7
In the past few centuries, statistical methods have been developed for reasoning in the face of uncertainty, as an outgrowth of methods for eliminating error. This was an echo of the program of Francis Bacon's Novum Organum. Bayesian inference acknowledges one's ability to alter one's beliefs in the face of evidence; this has been called belief revision, or defeasible reasoning: the models in play during the phases of scientific method can be reviewed, revisited and revised in the light of further evidence. This approach arose from the work of Frank P. Ramsey[1], John Maynard Keynes[2], and, earlier, William Stanley Jevons[3] in economics. On the bearing of such probabilities on one's individual actions, see Alan Hájek, "Scotching Dutch Books?", Philosophical Perspectives 19, and Per Gunnar Berglund, "Epistemic Probability and Epistemic Weight". The rise of Bayesian probability is traced below.
- The deviation from truth was first quantified two centuries ago, by mathematically modeling error as the deviation from the mean of a normal distribution of observations. Gauss used this approach to justify his method of least squares, which he used to predict the position of the asteroid Ceres. A normal (or Gaussian) distribution might then be used to characterize the error in observations. William Gosset's t statistic (Student's t) is a well-known statistic for error. Other probability distributions have been formulated: the Poisson distribution (in which the variance is equal to the mean), the exponential distribution, and so forth. Gauss' techniques may well be used to monitor the close encounter of Earth with the asteroid Apophis on April 13, 2036; the current estimate of the probability of an impact on that date is 1 in 45,000.
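A minimal sketch of least-squares fitting under Gauss' model of normally distributed error, using NumPy on synthetic data; the line parameters, noise level, and sample size are illustrative assumptions, not values from any historical fit:

```python
# Fit a straight line to noisy observations by least squares,
# assuming normally distributed (Gaussian) observation error.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
true_slope, true_intercept = 2.0, 1.0                  # illustrative "true" parameters
errors = rng.normal(loc=0.0, scale=0.5, size=x.size)   # Gaussian observation error
y = true_slope * x + true_intercept + errors

# Solve the least-squares problem for the design matrix [x, 1].
A = np.column_stack([x, np.ones_like(x)])
(slope, intercept), _, _, _ = np.linalg.lstsq(A, y, rcond=None)

print(f"estimated slope = {slope:.3f}, estimated intercept = {intercept:.3f}")
```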
- Originally, probability was formulated as a method for handling the risks of betting[4]. This viewpoint was systematized by Andrey Kolmogorov's axioms for probability theory, under which compound events can be decomposed into independent events.[5] Statistical theory had its origins in probability; Karl Pearson (1857 – 1936), along with other important contributors, established mathematical statistics, the basis of any textbook on inferential statistics.
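The axioms themselves are compact; the following is the standard formulation for a probability measure P on a sample space Ω with event algebra F, together with the definition of independence used to decompose compound events:

```latex
\begin{align*}
  & P(E) \ge 0 \quad \text{for every event } E \in \mathcal{F}, \\
  & P(\Omega) = 1, \\
  & P\!\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i)
      \quad \text{for pairwise disjoint } E_1, E_2, \ldots \in \mathcal{F}, \\
  & P(A \cap B) = P(A)\,P(B) \quad \text{when } A \text{ and } B \text{ are independent.}
\end{align*}
```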
- In a parallel effort, the algorithmic thinking of Leibniz, Pascal, Babbage and Jevons stimulated the development of mechanical computing, which gave rise to entire classes of professional careers. Before the rise of computing hardware in the mid-twentieth century, computer was a person's job title, and women were able to pursue professional careers as computers at a time when other professions were closed to them.
These statistical and algorithmic approaches to reasoning embed the phases of scientific method within their theory, including the very definition of some fundamental concepts.
- Thomas Bayes (1702 – 1761) started a method of thinking (defeasible reasoning) which acknowledges that our concepts can evolve from some ideal expectation to some actual result. In the process of learning some condition, our concepts can then keep pace with the actual situation. Unlike classical logic, in which propositions are evaluated as true, false, or undecided, a Bayesian thinker assigns a conditional probability to a proposition. This is called Bayesian inference: "The sun has risen for billions of years. The sun rose today. With high probability, the sun will rise tomorrow."
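A minimal sketch of the sunrise example as a Bayesian update, using Laplace's rule of succession with a uniform prior; the count of past sunrises is an illustrative placeholder, not a measured figure:

```python
# Rule of succession: with a uniform Beta(1, 1) prior and n sunrises observed
# in n trials, the posterior probability of a sunrise tomorrow is (n + 1) / (n + 2).
from fractions import Fraction

n = 1_000_000  # illustrative number of observed sunrises

posterior = Fraction(n + 1, n + 2)
print(f"P(sunrise tomorrow | {n} sunrises observed) = {float(posterior):.9f}")
```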
- Frank Plumpton Ramsey (1903 – 1930) formulated a practice, conveniently understood in terms of betting. In this practice (known as the pragmatic theory of truth), one assigns a probability, as a measure of partial belief, to a subjective proposition which one, as the interested individual, can understand best. Ramsey thus provided a foundation for Bayesian probability, which is a direct method for assigning posterior probabilities to any number of hypotheses. See the Ramsey biography and Ramsey theory.[6]
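A minimal sketch of assigning posterior probabilities to several hypotheses at once via Bayes' theorem; the priors (degrees of partial belief) and likelihoods are made-up numbers for illustration:

```python
# Posterior probabilities for competing hypotheses via Bayes' theorem:
#   P(H_i | E) = P(E | H_i) * P(H_i) / sum_j P(E | H_j) * P(H_j)
priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}             # partial beliefs before the evidence
likelihoods = {"H1": 0.10, "H2": 0.40, "H3": 0.70}     # P(evidence | hypothesis)

evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

for h, p in posteriors.items():
    print(f"P({h} | evidence) = {p:.3f}")
```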
- In the past 50 years, machines have been built which utilize this type of theory (see, for example, Dempster-Shafer theory). This method rests on the notion of prior and posterior probabilities of a situation or event. A prior probability is assigned before an event occurs or is known to exist, while a posterior probability is assigned after the event is known.
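A minimal sketch of Dempster's rule of combination, the belief-updating step in Dempster-Shafer theory, for two independent bodies of evidence over a two-element frame; the mass assignments are illustrative, not taken from any real system:

```python
# Dempster's rule of combination over the frame of discernment {a, b}.
# Each body of evidence assigns mass to subsets of the frame.
from itertools import product

frame = frozenset({"a", "b"})
m1 = {frozenset({"a"}): 0.6, frame: 0.4}                         # first body of evidence
m2 = {frozenset({"a"}): 0.3, frozenset({"b"}): 0.5, frame: 0.2}  # second body of evidence

combined = {}
conflict = 0.0
for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
    overlap = s1 & s2
    if overlap:
        combined[overlap] = combined.get(overlap, 0.0) + w1 * w2
    else:
        conflict += w1 * w2            # mass assigned to contradictory evidence

# Renormalize by the non-conflicting mass.
combined = {s: w / (1.0 - conflict) for s, w in combined.items()}
for s, w in combined.items():
    print(set(s), round(w, 3))
```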
- In statistical theory, experimental results, like observations, are elements of a sample space. Estimation theory can be applied to process the observations, perhaps in multiple comparisons, and signal processing can then be used to extract more information from them.
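A minimal sketch of extracting a signal estimate from noisy observations with a simple moving-average filter; the underlying signal, noise level, and window length are illustrative assumptions:

```python
# Recover an estimate of an underlying signal from noisy observations
# with a moving-average filter.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0 * np.pi, 200)
signal = np.sin(t)                                           # underlying signal
observations = signal + rng.normal(scale=0.3, size=t.size)   # noisy measurements

window = 11                                                  # filter length
kernel = np.ones(window) / window
estimate = np.convolve(observations, kernel, mode="same")

print("mean squared error, raw      :", np.mean((observations - signal) ** 2))
print("mean squared error, filtered :", np.mean((estimate - signal) ** 2))
```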
- Hypothesis testing is part of mathematical statistics; decision theory can be used in the design of experiments to select among hypotheses using a test statistic. See also: omnibus test, Behrens-Fisher problem, bootstrapping (statistics), falsifiability.
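A minimal sketch of selecting between a null and an alternative hypothesis with a test statistic, here Student's t on two synthetic samples via SciPy; the group means, spread, sample sizes, and the 0.05 significance level are illustrative assumptions:

```python
# Two-sample t-test: is the observed difference in group means significant?
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
group_a = rng.normal(loc=10.0, scale=2.0, size=30)   # illustrative control sample
group_b = rng.normal(loc=11.0, scale=2.0, size=30)   # illustrative treatment sample

t_stat, p_value = stats.ttest_ind(group_a, group_b)
alpha = 0.05                                         # chosen significance level
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
print("reject the null hypothesis" if p_value < alpha else "fail to reject the null hypothesis")
```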
- The design of experiments was systematized by Ronald Fisher (1890 – 1962), whose contributions include maximum likelihood estimation and Fisher's method for combining independent tests of significance at level α. Related notions include statistical significance, the null hypothesis, Type I and Type II errors, and the confidence level over a confidence interval.
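A minimal sketch of Fisher's method for combining independent tests of significance; the p-values are illustrative placeholders:

```python
# Fisher's method: X^2 = -2 * sum(ln p_i) follows a chi-squared distribution
# with 2k degrees of freedom when all k null hypotheses are true.
import math
from scipy import stats

p_values = [0.08, 0.12, 0.05, 0.20]      # illustrative independent test results

chi2_stat = -2.0 * sum(math.log(p) for p in p_values)
dof = 2 * len(p_values)
combined_p = stats.chi2.sf(chi2_stat, dof)

print(f"X^2 = {chi2_stat:.3f}, combined p = {combined_p:.4f}")
```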
- Error has a corresponding place in computation: a calculation carries an error determined by the number of digits or bits in its result, with the least significant figure discounted by an error tolerance band. Even in financial fields, where an account is best known to the penny, allowance for error is made by write-offs and losses.
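A minimal sketch of round-off error and an explicit tolerance band in ordinary floating-point arithmetic:

```python
# Floating-point results carry an error bounded by the number of bits in the format;
# comparisons therefore allow an explicit tolerance band.
import math
import sys

a = 0.1 + 0.2
print(a == 0.3)                             # False: the binary representation is inexact
print(abs(a - 0.3))                         # size of the round-off error
print(sys.float_info.epsilon)               # machine epsilon for 64-bit floats
print(math.isclose(a, 0.3, rel_tol=1e-9))   # True within a tolerance band
```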
- The stages of scientific method usually involve formal statements, or definitions, which express the nature of the concepts under investigation. Any time spent considering these concepts will materially aid the research. For example, the time spent waiting in line at a store can be modelled by queueing theory. The clerk at the store might then be considered an agent, and the owner of the store and each customer might be considered principals in a transaction.
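A minimal sketch of queueing theory applied to that waiting line, using the standard single-server (M/M/1) formulas; the arrival and service rates are illustrative assumptions:

```python
# M/M/1 queue: one clerk, Poisson arrivals, exponential service times.
arrival_rate = 0.8   # customers per minute (illustrative)
service_rate = 1.0   # customers the clerk can serve per minute (illustrative)

rho = arrival_rate / service_rate                          # utilization of the clerk
avg_in_system = rho / (1.0 - rho)                          # L: mean customers present
avg_time_in_system = 1.0 / (service_rate - arrival_rate)   # W: mean time in the store
avg_wait_in_line = rho / (service_rate - arrival_rate)     # Wq: mean time spent waiting

print(f"utilization           = {rho:.2f}")
print(f"avg customers present = {avg_in_system:.2f}")
print(f"avg time in system    = {avg_time_in_system:.2f} min")
print(f"avg wait in line      = {avg_wait_in_line:.2f} min")
```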
In summary, scientific thought, as embodied in scientific method, has moved from reliance on the Platonic ideal, with logic and truth as the sole criteria, to its current place, centrally embedded in statistical thinking, in which a model or theory is evaluated by random variables (mappings from experimental results to some mathematical measure), all subject to uncertainty, with an explicit error.