Talk:History of entropy

von Neumann, Shannon, and Entropy

Jheald, in the entropy history section you changed John von Neumann’s quotes around; in a sense, putting words in his mouth that he did not say. I would appreciate it if you would go back and replace the original quotes. Editing is one thing; changing history is another. Thanks: --Sadi Carnot 04:59, 10 April 2006 (UTC)

The quotation first appears in:

  • M. Tribus, E.C. McIrvine, Energy and information, Scientific American, 224 (September 1971).

Variants do appear on the internet, but I believe I have rendered the original correctly.

All best, Jheald 10:43, 10 April 2006 (UTC).

Version according to (Jheald):
Claude Shannon introduced the very general concept of information entropy, used in information theory, in 1948. Initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and the earlier work in thermodynamics; but the mathematician John von Neumann certainly was. "You should call it entropy, for two reasons," von Neumann told him. "In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."
Version according to (John Avery):
An analog to thermodynamic entropy is information entropy. In 1948, while working at Bell Telephone Laboratories electrical engineer Claude Shannon set out to mathematically quantify the statistical nature of “lost information” in phone-line signals. To do this, Shannon developed the very general concept of information entropy, a fundamental cornerstone of information theory. Initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and earlier work in thermodynamics. In 1949, however, when Shannon had been working on his equations for some time, he happened to visit the mathematician John von Neumann, who asked him how he was getting on with his theory of missing information. Shannon replied that the theory was in excellent shape, except that he needed a good name for “missing information”. “Why don’t you call it entropy”, von Neumann suggested. “In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage.”[1]
Reference
  1. ^ Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 9812384006. 

Does this revised version sound better? I've cleaned it up a bit; it is sourced to Nobel Prize-winning author John Avery, in what is essentially a small textbook on information theory. The chapter from which the above paragraph is copied, word for word, cites seven sources by Shannon, from the years '48 to '93. I hardly think that a famous 1949 story about the "father of information theory" would first have appeared only in 1971, 22 years after the fact. Either we can work together to reach a compromise, or we can put both our versions on the entropy talk page to see what other editors think.--Sadi Carnot 16:02, 10 April 2006 (UTC)

If you want to put it to the talk page, that's fine by me. It looks to me as though Avery is paraphrasing from memory the quotation in the Tribus article from 32 years earlier. Unless Avery gives a printed source for his wording of the quotation earlier than 1971, I would assume that is what happened.
Secondly, the expression "lost information in phone-line signals" is poor. Shannon entropy is much better thought of as a measure of uncertainty -- the uncertainty which is removed (or could be removed) if the recipient receives particular information.
This is also of course a very reasonable way to think about thermodynamic entropy; though it really took E.T. Jaynes to push that point of view (and its consequences for how we think about ensemble assignment).
In summary: I believe it is appropriate to go with the Tribus version of the quote, which appeared in print 32 years earlier, and IMO also reads better. -- Jheald 17:02, 10 April 2006 (UTC).
The measure-of-uncertainty idea sounds fine; the remaining points I dislike in your recommended version are newly bolded: I cannot imagine a polymath such as von Neumann using such poor syntax (repeating the word “name” twice, or declaring that "entropy" is a mystery). Let me know if you can re-word these?--Sadi Carnot 17:18, 10 April 2006 (UTC)
I am not going to re-word a direct quotation from a printed journal. -- Jheald 17:26, 10 April 2006 (UTC).
For context, here is an extended version of the Sci Am quotation:
“My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”
-- Jheald 17:44, 10 April 2006 (UTC).
I added both points of view into the article; at least until someone can find a source closer to the original date [1948].--Sadi Carnot 01:27, 11 April 2006 (UTC)

For the record, another largely similar version of the quote appears earlier in M. Tribus, "Information theory and thermodynamics", in Harold A. Johnson (ed.), Heat Transfer, Thermodynamics and Education: Boelter Anniversary Volume. New York: McGraw-Hill, 1964, p. 354.

"When Shannon discovered this function he was faced with the need to name it, for it occurred quite often in the theory of communication he was developing. He considered naming it "information" but felt that this word had unfortunate popular interpretations that would interfere with his intended uses of it in the new theory. He was inclined towards naming it "uncertainty" and discussed the matter with the late John Von Neumann. Von Neumann suggested that the function ought to be called "entropy" since it was already in use in some treatises on statistical thermodynamics... Von Neumann, Shannon reports, suggested that there were two good reasons for calling the function "entropy". "It is already in use under that name," he is reported to have said, "and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway." Shannon called the function "entropy" and used it as a measure of "uncertainty," interchanging the two words in his writings without discrimination.

sometimes quoted in the shortened form

"It is already in use under that name and besides it will give you a great edge in debates, because nobody really knows what entropy is anyway"

Jheald 15:23, 11 July 2006 (UTC)

Yes, well, in Jeremy Campbell’s 1982 book Grammatical Man – Information, Entropy, Language, and Life we find, on page 22, at the beginning of chapter two (“The Noise of Heat”), the following paragraph:
At first, Shannon did not intend to use such a highly charged term for his information measure. He thought “uncertainty” would be a safe word. But he changed his mind after a discussion with John von Neumann, the mathematician whose name is stamped upon some of the most important theoretical work of the first half of the twentieth century. Von Neumann told Shannon to call his measure entropy, since “no one really knows what entropy is, so in a debate you will always have the advantage.”
Since this duplicates the wording of the separate source I mentioned previously, I am going to assume this version (as bolded) is most likely the original, particularly because it reads fluidly, as though it came from a real conversation. Thanks, though, for the source; if you find more, put them here so I can check the book.--Sadi Carnot 16:38, 11 July 2006 (UTC)
Erm, actually it's not. It's a slight misquotation of the Tribus (1971) Scientific American source — which was the one I originally cited !!  :-) Jheald 17:34, 11 July 2006 (UTC)
Well, whatever the case, I hope all of this talk is getting us somewhere? --Sadi Carnot 03:02, 12 July 2006 (UTC)

Year entropy was coined?

From my readings, I have come across three different supposed “dates” for when the word entropy was coined. Mendoza’s Carnot + Clapeyron + Clausius compendium states that the word was coined in an 1852 paper; Perrot’s A to Z Dictionary of Thermodynamics states that it was proposed by Clausius in 1868; and Cengel’s textbook on thermodynamics states that in 1865 he chose to name the property entropy. Additionally, I’ve also read parts of an original copy of Clausius’ 1860 book (at the sacred copies room at the UIC library), with un-cut pages (believe that), and it might have the word entropy in it? I’m still digging around; if anyone has any tips for me, leave them here.--Sadi Carnot 01:23, 11 April 2006 (UTC)

FYI, I found the answers I was looking for:

(1850) - stated that an expression was needed to account for the experimental fact that "loss of heat occurs when work is done" (which Carnot had assumed did not occur).
(1854) - he defines the ratio Q/T and calls it the "equivalence-value" (so as to relate it to Joule's 1843 paper Mechanical equivalent of heat).
(1856) - calls it the "equivalence-value of all uncompensated transformations involved in a cyclical process" (and gives it the symbol -N).
(1862) - relates the integral of dQ/T to something he calls the "disgregation" of the body, having to do with the arrangement of the molecules of the working body.
(1865) - lets dS = dQ/T and first calls S the "transformation-content" of the working body, but then changes it to "transformational-energy", or entropy, so as to resemble the word energy (written out below in modern notation).
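For readability, here is the 1865 definition written out as a small LaTeX sketch; the "rev" subscript and the integral form are modern conventions for the Clausius definition, not his original 1865 notation:

% Clausius (1865): entropy S of the working body, defined (up to a constant)
% by integrating the reversibly exchanged heat divided by the absolute temperature.
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  S_2 - S_1 = \int_{1}^{2} \frac{\delta Q_{\mathrm{rev}}}{T}
\]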

See:

  • Mechanical Theory of Heat – Nine Memoirs on the development of the concept of "Entropy" by Rudolf Clausius [1850-1865]

Adios:--Sadi Carnot 15:19, 3 September 2006 (UTC)

Repetition?

Just wondering if the content is repetitive in places, especially in the sections "Historical Definitions" and "Classical Thermodynamic Views"; this gives an article that flows back and forth. As it is an article on history, maybe presenting it in a chronological frame is a good option?