Talk:Entropy
From Wikipedia, the free encyclopedia
New intro
Rudolf Clausius' 1879 book (2nd Ed.) Mechanical Theory of Heat (see page: 107 beginnings of entropy discussions) is now available in Google books. Thus, I have started updating the intro to the correct presentation, i.e. in Clausius' own words. --Sadi Carnot 21:36, 30 July 2007 (UTC)
I have reverted your last change which was:
- "In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[1] In short, entropy is a variable that quantifies the effects of irreversibility in natural processes."
Your paragraph is all true, but it is quite unintelligible to the average reader and far too concise. You also left the lead far too short, so that it is no longer a summary of the whole article. However, I recognise that you probably were going to add something more. Also, entropy is not anymore what Clausius wrote. We should be describing entropy as it is now understood and used, not its historical roots. Please stop and discuss your changes here. --Bduke 23:28, 30 July 2007 (UTC)
Current lead
Bduke, all I did was move the bulk of the lead to an "overview" section. The current lead paragraph (which is completely unreferenced), shown below, is filled with errors (especially the etymology):
- The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn) [wrong (τροπή meaning "transformation", )]) in thermodynamics is central to the 2nd law of thermodynamics, which deals with physical processes and whether they occur spontaneously [wrong (the measure of spontaneity is "free energy" as per the combined law of thermodynamics)]. Spontaneous changes occur with an increase in entropy [wrong (only for isolated systems)]. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed [close (in some cases, but no reference)]. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved [correct (but does not explain what the connection is to entropy)].
I'll move the bulk of the lead back in, but I'm still correcting all this mess; for instance, all these suppositions need to be referenced. As to your statement "entropy is not anymore what Clausius wrote", there is some truth to this (in terms of wording), but entropy, at its core, is what he wrote (in conceptual and mathematical terms). Now that the original paper is available, I intend to include a blend of this, as well as modern views, in the lead. No need to do any further reverting; please work together on this. --Sadi Carnot 07:05, 31 July 2007 (UTC)
- It seems good now. I'm guessing, however, that if the lead keeps growing, some of it will have to be moved into a new "overview" section (which is what I was attempting to do before), as per WP:LEAD, which states that the opening section "should contain up to four paragraphs, should be carefully sourced as appropriate, and should be written in a clear, accessible style so as to invite a reading of the full article". --Sadi Carnot 07:16, 31 July 2007 (UTC)
Sadi, please discuss it more here and let us see what others think. I do not agree one little bit that it "seems good now", but I'm not going to revert. The problem is that the new first paragraph is NOT "written in a clear, accessible style so as to invite a reading of the full article". It will be a complete off-put to most readers, particularly those who are coming to it from a discipline other than physics but realise that this is a part of physics they need to know about. This has been a long-term problem with this article and particularly its lead, but I just do not seem to be able to convince you and others. I cannot work together with you on it, because it is the very opposite of what I would like to see. Keep the lead simple. Let it attract people with very different views of why they came to read it. Make it intelligible. --Bduke 07:38, 31 July 2007 (UTC)
- I agree with you: any subject should be written in the manner that best conveys information digestibly. One should not, however, bend, twist, misconstrue or even misrepresent basic science and logic for the sake of readability. To review, as things currently stand we are debating the first two sentences of the article. Please explain what the fuss is about with these two sentences. All I did was to correct wrong information and to add a reference. --Sadi Carnot 08:00, 31 July 2007 (UTC)
Lead comparison
To give you a comparative idea of why the lead is in “good shape” now, below is the current lead for the energy article (which there seems to be no issues with):
Energy
- In physics, energy (from the Greek ενεργός, energos, "active, working")[2] is a scalar physical quantity, often represented by the symbol E,[3] that is used to describe a conserved property of objects and systems of objects.
Entropy
- In physics, entropy, symbolized by S, from the Greek τροπή meaning "transformation", is a mathematical function that represents the measure of the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state.[4]
There really is no need to make a big fuss over entropy; it’s basically the same thing as energy, only in a non-conservative sense. If you think the average reader is going to be “put off” by this sentence, then you might as well go over to the energy article and post a note on that talk page as well, because I see no difference between these two sentences in terms of difficulty. In short, the first sentence has to define the term. This is the way it is in all science articles. --Sadi Carnot 08:08, 31 July 2007 (UTC)
I think the lead to energy could be improved somewhat, but it really is not as difficult or off-putting as the current one to "Entropy". I do not think the first sentence has to define the term. It may do so, but often it is better to say in general terms what it is about, where it is used etc. and define it later. This does not mean being inexact or misleading. I do not want to sound patronising, but I think it is clear that you have never taught entropy or energy to people who are somewhat apprehensive about the topics. If you had you would see quite clearly what to me "the fuss is about". It has to attract people. It has to be simple, so readers can decide whether they need to get deeper. It currently does not do these things. I am busy with other things, so I am going to leave it to you. When you have done, put it to peer review and try to get it to featured article status. That will bring many others to comment on the article. --Bduke 09:32, 31 July 2007 (UTC)
- (Edit conflict note: this was written before seeing Bduke's post above) IMO, the lead sentence is as clear as mud. What is transformation-content? What is "dissipative energy use"? That 19th century quote can be impenetrable to the modern reader. I'd rather have a definition like this one (although it's not perfect either):
- "Quantity the change in which is equal to the heat brought to the system in a reversible process at constant temperature divided by that temperature. Entropy is zero for an ideally ordered crystal at 0 K. In statistical thermodynamics, S = k ln W, where k is the Boltzmann constant and W the number of possible arrangements of the system."[1]
- This definition has the deficiency of not saying what entropy is good for or what it "is", but it is concrete and clear. Saying what entropy "is" gets into issues of interpretations or analogies, of which everyone has a favorite. --Itub 09:41, 31 July 2007 (UTC)
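- The classical clause of that dictionary definition is easy to make concrete with a number. A minimal sketch (the function name and the latent-heat figure are illustrative additions, not part of the quoted definition):

```python
def entropy_change_reversible(Q, T):
    """ΔS = Q_rev / T: the heat brought to the system in a reversible
    process at constant temperature, divided by that temperature
    (the dictionary definition's first clause)."""
    return Q / T

# Melting one mole of ice at 273.15 K; ~6010 J/mol is a standard
# textbook value for the molar latent heat of fusion of water.
dS = entropy_change_reversible(6010.0, 273.15)
# dS comes out to about 22 J/(mol K)
```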
Dave souza's revert
Sadi, your enthusiasm for obscure historical definitions is noted, but this article is about informing newcomers to the subject and Bduke's considerable expertise on the subject has produced a much better lead than the proposed change, so I've restored it. .. dave souza, talk 09:56, 31 July 2007 (UTC)
- Dave, you reverted several of my edits (corrections to errors) just now, not just the definition. I'm flexible on this; however, I want to see a reference (or several) in the opening sentence and I don't want to see sloppy (incorrect) sentences. The sentence "spontaneous changes occur with an increase in entropy" is only correct for isolated systems; the novice reader will think it applies to all situations. The etymology is wrong too; I added an original source reference and you have reverted this as well. Also, the lead needs to be four concise paragraphs, and the rest moved to an overview section. Please be considerate of my editing efforts. If you want to blend in a new reference to make it easier to read then do so. The lead you reverted to:
- The concept of entropy (Greek: εν (en=inside) + verb: τρέπω (trepo= to chase, escape, rotate, turn)) in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy.
- is completely incorrect, e.g. see spontaneous process. This is what I am trying to clean up. --Sadi Carnot 16:48, 31 July 2007 (UTC)
- Spontaneous process is misleading at best and certainly needs work. It does not clearly say that ΔS is for the system only and ΔH / T is also an entropy term - for the surroundings. ΔG is really a measure of the total entropy change. --Bduke 22:32, 31 July 2007 (UTC)
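- Bduke's point that ΔG tracks the total entropy change can be shown in a few lines. A sketch under the usual constant-temperature, constant-pressure assumptions (the function name and the sample numbers are mine, purely for illustration):

```python
def total_entropy_change(dS_sys, dH_sys, T):
    """ΔS_total = ΔS_sys + ΔS_surr, where ΔS_surr = -ΔH_sys / T at
    constant T and pressure. Since ΔG = ΔH - T*ΔS_sys, this total
    equals -ΔG / T: ΔG really is a measure of the total entropy change."""
    return dS_sys - dH_sys / T

# Exothermic reaction: ΔH = -100 kJ/mol, ΔS_sys = -50 J/(mol K), T = 298 K.
dS_tot = total_entropy_change(-50.0, -100e3, 298.0)
dG = -100e3 - 298.0 * (-50.0)  # ΔG = ΔH - T*ΔS_sys
# dS_tot agrees with -dG / T, so the reaction is entropy-increasing overall
# even though the system's own entropy decreases.
```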
- Well, we can discuss whether the article should sharpen its discussion on the difference between the entropy of the universe and the entropy of the system. But, as the article spontaneous process makes quite clear, chemists define the word spontaneous to mean a process in which the entropy of the universe increases, i.e. a process allowed by the 2nd law. Jheald 17:17, 31 July 2007 (UTC)
- (As a physicist, that usage always makes me uncomfortable -- it seems misguided to me to call a reaction spontaneous, if in practice it doesn't occur spontaneously, because the reaction barrier is too high. But who am I to argue with chemists in full flood?) Jheald 17:20, 31 July 2007 (UTC)
- I kind of agree, but the distinction between thermodynamic control and kinetic control of a reaction is a useful one. --Bduke 22:32, 31 July 2007 (UTC)
Etymology
On the subject of the derivation of the word entropy, in Greek τροπή comes from τρέπω just like for example in English "cessation" comes from "cease". The verb τρέπω is the root word, and "chase, escape, rotate, turn" gives a good sense of what it means. The noun τροπή means a τρέπω-ing, hence a turning, a changing, a transformation.
I have to agree, in the strongest terms, with Itub above, when he writes that your proposed lead sentence "is as clear as mud. What is transformation-content? What is "dissipative energy use"?" These terms should be left in the 19th century. He is so right, when he writes, they are simply "impenetrable to the modern reader". I have reverted this ancient cruft, and would do so again without hesitation. Jheald 17:43, 31 July 2007 (UTC)
- Jheald, I was the one that added the original etymology (from Perrot's A to Z Dictionary of Thermodynamics), and now that I've seen the 2nd Edition of the book (page 107):
- Clausius, Rudolf. (1879). Mechanical Theory of Heat (pg. 107), 2nd Edition. London: Macmillan & Co.
- I have corrected it to how Clausius coined it. Thank you. --Sadi Carnot 17:55, 31 July 2007 (UTC)
- The difference is that εν + τρέπω understood as the "chasing/ escaping/ rotating/ turning" "inside" the system actually gives quite a helpful steer towards understanding what entropy is. "Transformation" doesn't. Jheald 18:02, 31 July 2007 (UTC)
- A measure of the unavailability of a system’s energy to do work. This actually is rather unhelpful. T_R S is a measure of the energy unavailable to do work. The dependence on the reservoir temperature T_R is fundamental. If T_R were zero, then all the energy would be available to do work. It is therefore not helpful to suggest that S on its own is a measure of the unavailability of a system’s energy to do work. Jheald 18:07, 31 July 2007 (UTC)
- Oh, and while you're at it, please learn enough about information theory to understand why saying Shannon entropy is "attenuation in phone-line signals" is imbecilic. Jheald 18:12, 31 July 2007 (UTC)
I added a second ref note per your request:
- The etymology of entropy, in modern terms, according to Perrot’s A to Z of Thermodynamics, can be interpreted to mean, “from the Greek root εντροπη, the act of turning around (τροπη, change of direction), implying the idea of reversibility”.
I hope this helps. As to the new 2005 Oxford Dictionary of Physics definition, do you really have to complain about every reference? First Clausius is too historic, now Oxford is too unhelpful. Give me a break. I'm only trying to add references to the article so as to give it credibility, rather than original research. As to Shannon, fix it if you know of a better wording. --Sadi Carnot 18:21, 31 July 2007 (UTC)
- As to Shannon, I was rather hoping you might go away and actually learn something, so you don't continue to inflict nonsense like this any more.
- As for the opening definitions, no, I'm not going to give you a break. Settling for misleading is not acceptable. "A measure of the unavailability of a system’s energy to do work" is horribly misleading, because that unavailability depends utterly on the reservoir temperature.
- Finally, connecting τροπη with the idea of reversibility is a spectacularly unhelpful intuition, even by your standards. What is valuable about the link with τρέπω = "chase, escape, rotate, turn" is that it gives some idea of internal molecular confusion. No, that's not what Clausius was thinking of when he coined the phrase. But it's the most valuable connection today. Clausius's original etymology frankly isn't helpful for the intro. Jheald 18:39, 31 July 2007 (UTC)
Jheald’s comments
Jheald, let me get this straight: from your point of view, I’m an imbecile and you want me to go away? --Sadi Carnot 02:36, 1 August 2007 (UTC)
- No, but from time to time, like all of us, you may write things which make you look clueless. At which point, the best solution is to get a clue. I told you 18 months ago that this sort of statement about information entropy was misconceived, and yet you still trot it out. Jheald 08:22, 1 August 2007 (UTC)
- In any event, thanks for the nice comments, I've added them to my user page. --Sadi Carnot 03:21, 1 August 2007 (UTC)
Economic entropy
I have changed economic entropy from being a quantitative value to a semi-quantitative value. I would go further and call it qualitative, but people might disagree. I fail to see how it can be quantitative without a mathematical definition, a lack the article itself acknowledges by filing it under sociological definitions. I would argue that quantitative measurements must be of a known quantity if they are to be named as such. Thanks User A1 11:26, 6 August 2007 (UTC)
Entropy and the relative number of states
Ω, the number of microstates, in S = k ln Ω might be better interpreted as a relative number of states which would be a dimensionless quantity for which the logarithm would be defined.
On p. 24 of Wolfgang Pauli's Statistical Mechanics (Vol. 4 of Pauli Lectures on Physics) he comments,
"The statistical view also permits us to formulate a definition of entropy for nonequilibrium states. For two states, 1 and 2, we have
S2 - S1 = k log(W2/W1);
leaving the additive constant unspecified, we obtain
S = k log W.
Because of the logarithm, and because the probabilities of independent states multiply, the additivity of entropy is maintained." --Jbergquist 18:41, 2 October 2007 (UTC)
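Pauli's remark that only a ratio of state counts enters the formula can be illustrated directly. A small sketch (the constant and function names are my own, not Pauli's), which also shows why the argument of the logarithm is dimensionless:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_difference(w_ratio):
    """S2 - S1 = k log(W2/W1): only the dimensionless ratio of the
    numbers of microstates enters, so the logarithm is well defined
    and the additive constant in S = k log W stays unspecified."""
    return K_B * math.log(w_ratio)

# Free expansion of N ideal-gas particles into twice the volume: each
# particle's accessible states double, so W2/W1 = 2**N and ΔS = N k ln 2.
N = 100
delta_S = entropy_difference(2.0**N)
```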
Gibbs entropy as fundamental but not defined?
Under "Miscellaneous definitions", "Gibbs entropy" is described as being the "usual statistical mechanical entropy of a thermodynamic system". However, Gibbs entropy does not appear to be defined in this article, and the linked article on "Gibbs entropy" does not define some of the terms used. 68.111.243.96 20:18, 17 October 2007 (UTC)
Entropy (disambiguation)
Editors of this page might like to look over the recent to-and-fro at Entropy (disambiguation). User:Thumperward seems dead set to (IMO) make the page harder to use. Compared to e.g. this edit, he seems determined to
- remove the link to Introduction to entropy
- remove links directing people to find additional entropy articles in the categories
- reduce the structuring of the page between thermodynamic entropy and information entropy.
-- all of which (IMO) are mistakes. Anyhow, there have been a series of reverts and counter-reverts (I've now had my 3 for the day), and there's discussion on the talk page there, if anybody wants to have a look. Jheald 13:52, 18 October 2007 (UTC)
- Never mind, we seem to have come to agreement. Edit war over :-) Jheald 15:47, 18 October 2007 (UTC)
"calculated using the multiplicity function" ????
And if I click on the wiki link for multiplicity function I see the expression for a system of N noninteracting spins :) Also, we should avoid using misleading examples of a system whose energy levels are exactly degenerate. It is better to define Ω as F. Reif does in his textbook: Ω is the number of energy eigenstates with energy between E and E + δE, where δE is a macroscopically small energy interval. The entropy defined in this way depends on the choice of δE, but this dependence becomes negligible in the thermodynamic limit. δE cannot be set to zero, because then for generic systems Ω = 1, and the entropy becomes identically zero.
Basically what happens is that if you specify the energy of a system with infinite accuracy, then there can be only one microstate compatible with that energy specification. This entropy is the so-called fine grained entropy, while the entropy defined with the nonzero δE is the coarse grained entropy. Count Iblis 15:24, 23 October 2007 (UTC)
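Count Iblis's point about the coarse-grained window δE can be made concrete with the spin system he mentions. A rough sketch (the function and the unit energy spacing are my own illustrative choices, not Reif's notation):

```python
from math import comb

def coarse_grained_count(N, E, dE, eps=1.0):
    """Ω for N noninteracting spins with energy levels n*eps (n = 0..N),
    each level having degeneracy C(N, n): count the eigenstates with
    energy in [E, E + dE). As dE -> 0, at most one level fits the window,
    which is the fine-grained limit where the entropy collapses."""
    count = 0
    for n in range(N + 1):
        if E <= n * eps < E + dE:
            count += comb(N, n)
    return count

# A macroscopically small window covering several levels vs. an
# essentially sharp energy specification:
wide = coarse_grained_count(100, 45.0, 10.0)    # levels n = 45..54
sharp = coarse_grained_count(100, 45.0, 1e-9)   # only the n = 45 level
```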
Clausius Inequality
Shouldn't the inequality be a less than or equal to sign rather than a greater than or equal to sign? That should probably be fixed. --Lee —Preceding unsigned comment added by 24.3.168.99 (talk) 05:39, 21 November 2007 (UTC)
No - the inequality is correct. Read the statement just above the equation where it states that the result should be positive or zero for the extreme situation. This is a basic statement of the 2nd Law of Thermodynamics. But thanks for checking. PhySusie (talk) 13:09, 21 November 2007 (UTC)
I wish I knew how to add things to my own comments
My mistake on that one, haha. I overlooked the part where the article explained heat transfer from the body was positive (every class I've ever taken has had it as negative). That's why I'm used to seeing the Inequality with a less than or equal to sign. Thanks again! —Preceding unsigned comment added by 24.3.168.99 (talk) 06:45, 23 November 2007 (UTC)
GA Sweeps (on hold)
This article has been reviewed as part of Wikipedia:WikiProject Good articles/Project quality task force in an effort to ensure all listed Good articles continue to meet the Good article criteria. In reviewing the article, I have found there are some issues that may need to be addressed, mainly whether having a separate "Introduction to" article is a sufficient replacement for writing at a general level. In my view it is not, but I'm happy to be convinced otherwise, or post the article at Good Article Reassessment for other opinions.
There are a few other fixes listed below that are needed to keep Entropy at a good article standard.
GA review – see WP:WIAGA for criteria
I was initially of two minds about using an "introduction to" article as a general-audience workaround. On the one hand, yes, it is a very good way to avoid the difficulties inherent in explaining advanced concepts at a basic level, but on the other hand, it bears similarities to a "criticisms of" POV fork. After reading through the lead of the article, I think there's a good reason to merge the "introduction to" article into this one: when writing for a general audience, one tends to focus more on clarity than absolute precision, and it is this clarity that is crucial for the lead of this sort of highly technical article. Also, most of the sections in this article are summaries of other articles, but still focused at a very high level. It would be better if the summaries were geared to a general audience and the high-level material left for the main articles.
- Is it reasonably well written?
- A. Prose quality:
- The lead could be made clearer and more focussed, especially the first paragraph. See comment after review.
- B. MoS compliance:
- There are some Manual of Style problems in addition to the general audience discussion above. They're only small errors, but there are quite a few of them. For example, remember to avoid slipping into textbook style and using "we" in derivations, and that punctuation after a math tag must go inside. See Wikipedia:Manual of Style (mathematics).
- A. Prose quality:
- Is it factually accurate and verifiable?
- A. References to sources:
- B. Citation of reliable sources where necessary:
- There are a number of unsourced facts, especially in the history section. The GA criteria specify that, at a minimum, every statement that could be contested must be sourced with an inline citation.
- C. No original research:
- The ice melting example has to count as original research unless it is sourced. As it's apparently used in many textbooks, this won't be hard to fix.
- Is it broad in its coverage?
- Is it neutral?
- Is it stable?
- Does it contain images to illustrate the topic?
- A. Images are copyright tagged, and non-free images have fair use rationales:
- B. Images are provided where possible and appropriate, with suitable captions:
- Overall:
Regarding the lead, it contains 7 different definitions of entropy:
- [it] is a measure of the unavailability of a system’s energy to do work.
- is a measure of the randomness of molecules in a system.
- entropy is a function of a quantity of heat which shows the possibility of conversion of that heat into work.
- [is] the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state
- [it has] often been defined as a change to a more disordered state at a molecular level.
- has been interpreted in terms of the "dispersal" of energy.
- is defined by the differential quantity dS = δQ / T
While I'm sure they're all true, it makes the lead seem very cluttered, and I come away with a cluttered idea of what entropy is. I think that if the lead were refactored such that each paragraph had a clear, single focus, it would improve the article dramatically.
Feel free to drop a message here if you have any questions, and many thanks for all the hard work that has gone into this article thus far! --jwandersTalk 03:58, 19 February 2008 (UTC)
Delisted from GA. See review directly above. --jwandersTalk 22:28, 20 February 2008 (UTC)
other sections on entropy: economics, and as metaphor
There should be some other sections on entropy as it appears in macroeconomic theory, and as it has been used outside of science as a metaphor.
The first: I do not mean the technical (information theory) use of entropy as a measure of this or that quality of information generated about one or another economic variable. I mean: there's a revolution happening in economics, as the old Newtonian mechanism is being replaced by a macro perspective that understands that economic processes are one-way flows that happen in time, in which scarce (low entropy) matter and energy is taken up into the economy and degraded (high entropy) matter and energy is extruded/exhausted/discarded. I wrote a 'graph on this that seems to have disappeared. Maybe I didn't follow directions or rules?
The second: the idea of entropy has been widely used by poets, historians, fiction writers, thinkers of all kinds, some of whom understood it, some of whom didn't. Still, a comprehensive article on the subject could give a brief survey of these uses. I wrote some text on this, too, and it isn't there today. Problem? —Preceding unsigned comment added by 128.252.254.30 (talk) 19:24, 1 March 2008 (UTC)
Lede context
Hello,
A few questions, which I post without any
- is thermodynamics a branch of physics? I always thought of it as a branch of chemistry - particularly the statistical mechanics considerations, though I can see how it could go either way.
- Is entropy really a purely thermodynamic property? I would have thought that entropy is a statistical property which finds use in fields such as thermodynamics, information theory, etc.
Maybe I am just favouring statistical mechanics... Thanks User A1 (talk) 22:59, 25 March 2008 (UTC)
- Ad 1. Our Thermodynamics article starts like this: "Thermodynamics (...) is a branch of physics that studies the effects of changes...". The article History of entropy describes how the notion was already well developed (see also Classical thermodynamics and Entropy (classical thermodynamics)) before the statistical explanation was developed (see Statistical thermodynamics and Entropy (statistical thermodynamics)).
- Ad 2. Entropy is not a purely thermodynamic concept, although it originally was, and the statistical definition used in thermodynamics is specific to that field. However, as it is, it is the thermodynamic concept that is described by this article. I am in favour of renaming this article Entropy (thermodynamics), a name that currently redirects here, as does Thermodynamic entropy. See also the discussion raging at Talk:Entropy (disambiguation). --Lambiam 21:40, 26 March 2008 (UTC)
Requested move
Entropy → Entropy (thermodynamics) — The article appears to discuss thermodynamics only, and fails to review entropy in other branches of physics, information science and mathematics. —linas (talk) 04:14, 27 March 2008 (UTC)
Once again, the stupidity of masses rears its ugly head, as the above exhibits in spades. At the risk of being uncivil, I say "fuck wikipedia". If this is what the cornhole editors with their heads stuck up their asses want, this is what they get. linas (talk) 02:16, 8 April 2008 (UTC)
Error in Explanation
'then entropy may be (most concretely) visualized as the "scrap" or "useless" energy'
Usually in an article discussing a useful combination of more basic physical quantities, the units of the item are given. In this article they are not explicitly covered. Big mistake. And it leads to incorrect statements like the one above. Entropy is not energy. The term energy has a whole lot of baggage that comes with it, and to suggest that entropy carries the same baggage (say like conservation) contributes to a gross misunderstanding of what is going on. I hope authors/editors will be much more careful. Properly presenting the ideas of physical chemistry requires much more rigor than present in this article. blackcloak (talk) 05:32, 7 June 2008 (UTC)
- Thanks for the comment, this article has been subject to a sort of tug of war between various perceptions of how to explain a difficult concept involving advanced mathematics in a simple way accessible to the layman. This earlier version was edited by an educator, and may be nearer what you were looking for. The article's gone through numerous intermediate stages, as in this version, and the lead has been stripped down to the point where it's probably missing out on essentials while still including misleading cruft. Rather beyond me, but your assistance in a rewrite will be greatly appreciated. Note, of course, that thermodynamic entropy applies to more than physical chemistry. . dave souza, talk 08:14, 7 June 2008 (UTC)
- Well, today the average lay person is much more familiar with information-theoretical concepts because many people have a computer these days (certainly those people who visit this page :) ). So, we can explain the rigorous formulation much more easily than, say, Landauer could half a century ago. Why can't Maxwell's demon be effective? Today that's almost a no-brainer to a ten-year-old. Count Iblis (talk) 13:29, 7 June 2008 (UTC)
I rarely have time these days to think about this article, but I want to make a comment in response to blackcloak. I suggest that the urge for "much more rigor than present in this article" does more harm than good and probably explains why students in physical chemistry never really understand entropy. What is needed, at least at first, is not rigor but clarity, so the reader can see what entropy is actually about and why they need to learn about it. Rigor can follow later. I am not of course suggesting that the "clarity" phase should be false, but it does not need to be rigorous. It also needs to take into account that many students have a poor background in mathematics. --Bduke (talk) 22:36, 7 June 2008 (UTC)
- On the other hand, claiming entropy is "scrap or useless energy" is not clear, and is not good. It does not help understanding if entropy is confused with energy. The unusable energy is TR S, where TR is the temperature of the coldest accessible reservoir. Jheald (talk) 20:08, 9 June 2008 (UTC)
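- Jheald's correction is easy to state numerically: what has units of energy is T_R·S, not S itself. A tiny illustrative sketch (the names here are mine, for illustration only):

```python
def unavailable_energy(S, T_reservoir):
    """T_R * S: the part of a system's energy that cannot be turned into
    work, given a coldest accessible reservoir at temperature T_R.
    S alone has units J/K, not J, so it is not itself 'useless energy'."""
    return T_reservoir * S

S = 10.0  # J/K, an arbitrary system entropy
room = unavailable_energy(S, 300.0)  # 3000 J with a 300 K reservoir
cold = unavailable_energy(S, 0.0)    # 0 J: at T_R = 0 all energy is available
```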
- Why don't we say that the entropy of a system is the maximum amount of information you could theoretically store in the system without affecting its macroscopic state? This definition is perfectly understandable to most lay people. Count Iblis (talk) 02:45, 10 June 2008 (UTC)
- It is totally confusing to someone who comes across the term "entropy" in connection with chemistry. But even wider, where on earth do you get the idea that it "is perfectly understandable to most lay people"? Can you back that up with a source? --Bduke (talk) 23:01, 10 June 2008 (UTC)
- Well, the problem is that people are taught thermal physics in the wrong way in high school. At university part of our work is to let the students unlearn what they learned in high school. Entropy is fundamentally an information theoretical or statistical concept, just like heat, temperature etc. are. If we just say that entropy is related to heat and temperature, we aren't explaining anything.
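- For what the information reading amounts to quantitatively: thermodynamic entropy converts to an information capacity through division by k ln 2, the standard Landauer-style identification (the function name below is my own, purely illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_in_bits(S):
    """Convert a thermodynamic entropy S (in J/K) into an information
    capacity in bits via S / (k ln 2) -- the sense in which the entropy
    of a system bounds the information storable in it without changing
    its macroscopic state."""
    return S / (K_B * math.log(2))

# Erasing one bit at temperature T dissipates at least k T ln 2 of heat;
# the corresponding entropy, k ln 2, is exactly one bit.
one_bit_entropy = K_B * math.log(2)
```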
- I'm not saying that we should explain everything mathematically, that's not necessary. Compare e.g. how Chaitin explains Gödel's theorem to lay people. The information theoretical approach clearly works very well here, precisely because lay people are familiar with computers, computer memory etc. etc.. Count Iblis (talk) 21:23, 11 June 2008 (UTC)
- I do not know where you come from, but most of the students I have taught physical chemistry to in the last few decades had not even studied physics at High School and had not come across entropy before. I see you are a physicist. It seems to me that what is familiar and obvious to your students is very far from being so to everyone else. I give up. I just do not see a way forward. This article will continue to be dominated by physicists and it will continue to be totally unclear and unhelpful to anybody else. --Bduke (talk) 23:17, 11 June 2008 (UTC)
- We do have to explain everything from the start to the students. I don't really believe that genuinely interested people who are willing to learn can fail to understand something as simple as entropy. But they do have to be open to the idea that their intuitive ideas about entropy, heat and temperature may be wrong.
- The reason why people find physics difficult is because we don't teach it properly until the students go to university. Just think about how well you would read and write English if you were not taught to read and write until you were 18 years old. Now, if our reaction to this problem is to dumb things down even more we are only going to make the problem worse. We have to keep in mind that Wikipedia is also read by many children in primary and high school. They would benefit from being exposed to real physics instead of the dumbed down physics stuff they are taught in school. Count Iblis (talk) 02:40, 12 June 2008 (UTC)
- BDuke, Just thought I would wade in here. If you want to make progress, one suggestion would be to create a page in your own user-namespace, e.g. User:Bduke/entropy, and then use that to construct what you believe to be a good modification - that way you can actually point at something and say "this is a better explanation; what do you think" rather than "Currently the way the article does it is wrong, we should do it a better way". More likely you will get a more enthusiastic response from other editors. See WP:BOLD. On the other hand, this is more work :( - Can't win em all, huh? User A1 (talk) 00:40, 12 June 2008 (UTC)
- I actually did that long ago, but under a different title which I forget. I deleted it. It led to a rewrite of the intro para as Dave Souza mentions in the second para above in this section. I had other things to do and it just reverted back to where it is now. It is just too hard unless others recognise that we do have a real problem with this article and many others. I just do not have the time to fight this alone. --Bduke (talk) 01:01, 12 June 2008 (UTC)
Defining entropy as the maximum amount of information you could theoretically store in the system without affecting its macroscopic state is not understandable to most lay people IMO. That definition only makes sense if the reader is acquainted with a quite technical meaning of "information", which takes the reader who doesn't know it in a nearly circular path of confusion. It is also counterintuitive to suggest that a gas "holds" more information than a solid, for example. What do you mean by "hold"? Why are hard drives not gaseous then? ;-) Like I suggested above already, I think it is best to start with a clear and unambiguous definition such as [6], even if it doesn't explain what entropy is good for or what it "is". The analogies and examples can come later. --Itub (talk) 08:54, 12 June 2008 (UTC)
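The "information" definition debated above can at least be made quantitative. A minimal sketch, assuming the standard Boltzmann relation S = k<sub>B</sub> ln W: dividing a thermodynamic entropy by k<sub>B</sub> ln 2 gives the number of bits needed to specify the exact microstate given only the macrostate (the molar entropy figure below is an illustrative round value, not taken from the discussion):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the SI)

def entropy_in_bits(S):
    """Convert a thermodynamic entropy S (J/K) into bits:
    S = k_B * ln(W) implies log2(W) = S / (k_B * ln 2),
    i.e. the number of yes/no questions needed to single out
    one microstate among the W compatible with the macrostate."""
    return S / (k_B * math.log(2))

# Illustrative value: roughly the standard molar entropy of helium
# gas, about 126 J/K per mole.
bits = entropy_in_bits(126.0)   # on the order of 10**25 bits
```

The astronomical result illustrates both sides of the argument: the conversion is well defined, but "amount of information you could store" is a highly abstract reading of it for a lay reader.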