Richard Jeffrey

Richard Carl Jeffrey (August 5, 1926 – November 9, 2002) was an American philosopher, logician, and probability theorist. He is best known for developing and championing the philosophy of radical probabilism and the associated heuristic of probability kinematics, also known as Jeffrey conditioning.

Life and work

Born in Boston, Massachusetts, Jeffrey served in the U.S. Navy during World War II. As a graduate student he studied under Rudolf Carnap, and Carl Hempel.[1] He received his M.A. from the University of Chicago in 1952 and his Ph.D. from Princeton in 1957. After holding academic positions at MIT, City College of New York, Stanford University, and the University of Pennsylvania, he joined the faculty of Princeton in 1974 and became a professor emeritus there in 1999. He was also a visiting professor at the University of California, Irvine.[2]

As a philosopher, Jeffrey specialized in epistemology and decision theory. He is perhaps best known for defending and developing the Bayesian approach to probability.

Jeffrey also wrote, or co-wrote, two widely used and influential logic textbooks: Formal Logic: Its Scope and Limits, a basic introduction to logic, and Computability and Logic, a more advanced text dealing with, among other things, the famous negative results of twentieth century logic such as Gödel's incompleteness theorems and Tarski's indefinability theorem.

Jeffrey, who died of lung cancer at the age of 76, was known for his sense of humor, which often came through in his breezy writing style. In the preface of his posthumously published Subjective Probability, he refers to himself as "a fond foolish old fart dying of a surfeit of Pall Malls".[3]

Radical probabilism

In frequentist statistics, Bayes' theorem provides a useful rule for updating a probability when new frequency data becomes available. In Bayesian statistics, the theorem itself plays a more limited role. Bayes' theorem connects probabilities that are held simultaneously; it does not tell the learner how to update probabilities when new evidence becomes available over time. This subtlety was first pointed out explicitly by Ian Hacking in 1967.[4]

However, it is tempting to adapt Bayes' theorem and adopt it as a rule of updating. Suppose that a learner forms probabilities Pold(A & B) = p and Pold(B) = q. If the learner subsequently learns that B is true, nothing in the axioms of probability, or the results derived from them, tells him how to behave. He might be tempted to adopt Bayes' theorem by analogy and set Pnew(A) = Pold(A | B) = p/q.
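As a minimal numeric sketch of this updating step (the probability values here are hypothetical, chosen only for illustration):

```python
# Hypothetical prior probabilities held by a learner.
p_old_A_and_B = 0.3   # P_old(A & B) = p
p_old_B = 0.5         # P_old(B) = q

# On learning that B is certainly true, Bayes' rule of updating sets the
# new probability of A to the old conditional probability P_old(A | B) = p/q.
p_new_A = p_old_A_and_B / p_old_B

print(p_new_A)  # 0.6
```

The division p/q is just the definition of the conditional probability Pold(A | B); what Bayes' rule of updating adds is the claim that this number should become the learner's new unconditional probability for A.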

In fact, that step, Bayes' rule of updating, can be justified as both necessary and sufficient through a dynamic Dutch book argument, which is additional to the arguments used to justify the axioms. The argument was first put forward by David Lewis in the 1970s, though he never published it.[5]

That works when the new data is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain".[6] There must, on Lewis' account, be some certain facts on which probabilities are conditioned. However, the principle known as Cromwell's rule declares that nothing, apart from a logical law, can ever be certain, if even that. Jeffrey famously rejected Lewis' dictum and quipped, "It's probabilities all the way down." He called this position radical probabilism.

In this case Bayes' rule isn't able to capture a mere subjective change in the probability of some critical fact. The new evidence may not have been anticipated, or may not even be capable of being articulated after the event. It seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating, much as Bayes' theorem was extended.[7]

Pnew(A) = Pold(A | B)Pnew(B) + Pold(A | not-B)Pnew(not-B)
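The rule above can be sketched numerically as follows (all probability values are hypothetical, chosen only for illustration):

```python
# Hypothetical conditional probabilities held before the evidence arrives.
p_old_A_given_B = 0.9       # P_old(A | B)
p_old_A_given_not_B = 0.2   # P_old(A | not-B)

# The learner's probability for B shifts, without B becoming certain.
p_new_B = 0.7               # P_new(B)

# Jeffrey conditioning: weight the old conditional probabilities
# by the new probabilities of B and not-B.
p_new_A = p_old_A_given_B * p_new_B + p_old_A_given_not_B * (1 - p_new_B)

print(round(p_new_A, 2))  # 0.69
```

Note that when Pnew(B) = 1, the formula collapses to Pnew(A) = Pold(A | B), so ordinary Bayesian conditioning is the special case of Jeffrey conditioning in which the evidence is certain.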

Adopting such a rule is sufficient to avoid a Dutch book, but not necessary.[8] Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning.

It is not the only sufficient updating rule for radical probabilism. Others have been advocated including E. T. Jaynes' maximum entropy principle and Brian Skyrms' principle of reflection.

References

  1. Jeffrey, Richard. "A Proposal to the National Science Foundation for Support of Research on Carnap's Inductive Logic" (PDF). Richard Jeffrey's Papers. Special Collections Department, University of Pittsburgh. Retrieved September 17, 2013.
  2. Princeton University Department of Philosophy. "Richard C. Jeffrey". Retrieved July 11, 2017.
  3. Jeffrey, Richard. Subjective Probability. p. xii.
  4. Hacking, Ian (1967). "Slightly more realistic personal probability". Philosophy of Science. 34: 311–325.
  5. Skyrms, Brian (1987). "Dynamic coherence and probability kinematics". Philosophy of Science. 54: 1–20.
  6. Lewis, C. I. (1946). An Analysis of Knowledge and Valuation. La Salle, Illinois: Open Court. p. 186.
  7. Jeffrey, Richard (1987). "Alias Smith and Jones: The testimony of the senses". Erkenntnis. 26: 391–399.
  8. Skyrms (1987)