Talk:Entropy/Archive7
Evolution, order, and entropy
Hi, owing to the objections some have expressed in regards to entropy’s association with “disorder”, I have spent almost a month now adding over a dozen new references, uploading 3 new images, adding new formulas, and adding both the historical and modern-day conceptions of how “order” and “disorder” are utilized in current science. To be clear, there is no underlying motive or conspiracy here; these are well-established theories and methods of presentation. Many of us have degrees in this field and do research on this topic. Google search results clearly show that there are more than 12 million articles written with discussion of entropy, order, and disorder. Thus, please, if you have some type of disagreement or objection to entropy being discussed and presented in terms of “disorder”, possibly based on some sort of entropy/creationism conflict or beginner chemistry issues, then discuss this somewhere else. We are only here to present topic information, as it is, in an unbiased, neutral point-of-view manner. I certainly apologize if I have failed to present entropy in a crystal clear, easily discernible manner. Yet entropy itself, in the words of many scientists and authors, is one of the most puzzling concepts in all of science. Thus, if, in our effort to facilitate topic understanding, we unintentionally resort to bending phrasings and presentation styles, almost to the effect that we are assembling original presentations using our own personal conceptions, then we are in fact no longer on topic. To highlight the subtle point I am digging at, in the words of the great Nobelist Gilbert Lewis, from the Oxford Dictionary of Scientific Quotations:[1]
“There is always the danger in scientific work that some word or phrase will be used by different authors to express so many ideas and surmises that, unless redefined, it loses all real significance.”
Thanks everyone for your help. Later: --Sadi Carnot 16:04, 13 November 2006 (UTC)
- You've certainly done a lot to show cited use of "disorder", and I fully accept that the term is common. Of course, in your zeal you may have taken some citations out of context, as in the "tarakhe" example, and I'd hope that this can be carefully checked. However, it's disappointing that you don't seem to have found a clear definition or explanation of these unusual meanings of order and disorder. As PAR rightly points out, disorder is an introductory term, and for that to work it's important that beginners should be able to understand it. Indeed, the terms seem to shift meanings within your explanation and, as the quote above says, lose all real significance. This contributes to an unfortunate impression that calling entropy "disorder" leads to a termination of thought, rather than looking for the underlying explanation in terms of energy. For example, the impression is given that the appearance of order in galaxies means low entropy, which has to be offset by more "disorder" being "created" elsewhere. I make no claims to expertise, but would have thought that the formation of galaxies under the influence of gravity involved an increase in entropy. Perhaps some expert can clarify this point. It'd also be interesting to know how an organism is to decay when isolated from energy – doesn't decay usually involve, eh, living organisms? ... dave souza, talk 09:53, 14 November 2006 (UTC)
- Regarding the "tarakhe" example, I saw this first in a thermodynamics textbook yesterday, but then lost track of where I saw it; I'll track it down. As for the new section I added, it's brand new and it's not easy to patch together a difficult topic on first pass. Let's work together to clean and clarify it. As for order and disorder in galactic-perspectives and biological evolutionary perspectives, this is a topic of current research, writing, and theorizing. New books are being published almost yearly. No one that I know of has been able to paint a crystal clear picture, yet many are working to paint small sections at a time. As for black hole entropy, I have only read one or two books on this; I think it is still a big puzzle for many. --Sadi Carnot 08:32, 15 November 2006 (UTC)
- Yes, we are getting closer to the truth here: "appearance of order" and its opposite number "appearance of disorder". Appearance. The truth is in that one word. •Jim62sch• 10:47, 14 November 2006 (UTC)
- I don't know what you mean. --Sadi Carnot 08:32, 15 November 2006 (UTC)
- Ponder. Let your intellect run unfettered. Don't rely on textbooks. Consider it a Gedankenexperiment. •Jim62sch• 10:35, 15 November 2006 (UTC)
- Sadi, I'm glad to see that you've trimmed the section considerably. As for order and disorder in galactic-perspectives and biological evolutionary perspectives, don't you think you should have a clearer understanding of what's going on before you add things to articles? It appears that your focus on "disorder" may be leading you astray – I've just had a look at your Entropy (order and disorder) section on Adiabatic demagnetization, and you seem to have completely misunderstood the process: while Magnetic refrigeration unfortunately also throws in the odd unexplained reference to disorder, it gives a much better explanation of the process, and you may wish to amend the "order and disorder" explanation accordingly. ... dave souza, talk 10:39, 15 November 2006 (UTC)
- My focus and attention on "order and disorder" is a result of you and Jim62 (who have evolution/creation issues with it) and Frank Lambert (who has teaching and understandability issues with it). I'm sure if the three of you had your way you would delete it out of Wikipedia. Myself, I really couldn't care less about the order-disorder entropy perspective; I just don't like to see an unbalanced perspective written in, or comments like “disorder is dead”, when the predominant view argues otherwise. As to my Adiabatic demagnetization contribution, you can discuss your issues with Halliday and Resnick (1988), Fundamentals of Physics, 3rd Ed., because it is their example, and it seems perfectly clear to me. Can't you guys cut me a break? I've spent hours searching for and adding in examples, paragraphs, and diagrams just to appease your concerns. Later: --Sadi Carnot 12:03, 15 November 2006 (UTC)
- 1. "(who have evolution/creation-issues with it)" - that's only a teensy-tiny part of it, in fact it's the last thing on the list of problems. If that's all you've gotten out of what I've written about chaos and disorder, well...•Jim62sch• 21:53, 16 November 2006 (UTC)
- Well, Sadi, I'd hope you'll now cut other editors a break and try to work cooperatively on this article – for a start you can try to stop falsely attributing motives to other people, and try to assume good faith. It is to be hoped that we can now work together towards a clearer explanation of "disorder". As to your Adiabatic demagnetization contribution, try checking just what Halliday and Resnick wrote: they should surely realise that applying the magnetic field to material in the adiabatic enclosure will raise its temperature, so the subsequent removal of the magnetic field would merely bring it back to its initial temperature. While your effort is of course appreciated, please remember that you don't own the article and the important thing is to inform ordinary readers rather than enforcing your own perspective on entropy. ... dave souza, talk 16:55, 15 November 2006 (UTC)
Yes, of course I don’t own the article, but then again I’m not going to sit by and watch:
- someone delete paragraph after paragraph of months of hard work that we all have contributed to.
- someone put in views that are not in accordance with WP:NPOV per the subject.
- someone not use proper references and credible sources, e.g. textbooks, books, noted articles, etc.
- someone misrepresent historical figures in an unjust, wrong, or negative light.
As for the temperature-lowering magnetization example, I checked it again: the description comes from a noted 3rd-edition physics textbook, and from what I have read it seems to be used in practice as a technique to achieve low temperatures. If I come across another book that describes this technique, I will modify that section accordingly. Again, I only added this because you requested more examples. Thanks: --Sadi Carnot 18:18, 15 November 2006 (UTC)
- No-one's asking you to sit by: you're being asked to discuss things and work cooperatively rather than lashing out. If you've represented your book accurately, it gives a remarkably poor description: irrespective of whether or not the entropy involved is described as "disorder", when the entropy of the magnetised material is reduced its heat capacity is also reduced, and it heats up: a heat sink is needed to remove that surplus heat before it is insulated and the magnetic field relaxed for the second phase of the process. Next you'll be telling us that a domestic fridge works by compressing and decompressing the refrigerant inside the insulated compartment! ... dave souza, talk 19:43, 15 November 2006 (UTC)
- Dave, don’t shoot the messenger. I’m no expert on “ultra cold temperature” physics. This is a build-as-we-go encyclopedia; I started it as an example, possibly later I’ll find another book that clarifies further or maybe an expert will come along to add to the section. That’s all I can add presently. It looks correct to me. Of course, it could be that both myself and the textbook are wrong, and you are right. You’ve made your point and I’ll agree to keep my eyes open for other examples on this same topic. Thanks: --Sadi Carnot 04:16, 16 November 2006 (UTC)
- It's worth changing because as set out the refrigeration effect obviously contravenes basic thermodynamics, and this encyclopedia's about informing readers rather than puzzling them – if you've no objection, I'll clarify the article. .. dave souza, talk 10:31, 16 November 2006 (UTC)
- Please do -- this article needs significant clarification. And no Sadi, I'm not making any substantial edits until the Sadi Show ends. •Jim62sch• 21:56, 16 November 2006 (UTC)
- Entropy (order and disorder)#Adiabatic demagnetization clarified as requested. Sadi, it's good that you don't care about the order-disorder entropy perspective, and you'll readily appreciate that consideration of energy flow is always needed to explain what's going on. .. dave souza, talk 11:28, 17 November 2006 (UTC)
Good work, Dave. On first pass, your contribution seems to have improved that section. This seems to be a thick topic, utilizing many different theories; we’ll have to put some thought into the issue. The terms “magnetic energy” and “thermal energy” will probably need some more clarification, in terms of what these visually mean via atomic movements. I’m sure we can all put our heads together to build on this section as time goes on. Talk later: --Sadi Carnot 15:46, 18 November 2006 (UTC)
Chaos and tarakhe
OK, let's get this clear -- this statement is wrong: The word “chaos” itself is based on the Greek word tarakhe for "disorder". Let us look at the ref: c.1440, "gaping void," from L. chaos, from Gk. khaos "abyss, that which gapes wide open, is vast and empty," from *khnwos, from PIE base *gheu-, *gh(e)i- "to gape" (cf. Gk khaino "I yawn," O.E. ginian, O.N. ginnunga-gap; see yawn). Meaning "utter confusion" (1606) is extended from theological use of chaos for "the void at the beginning of creation" in Vulgate version of Genesis. The Gk. for "disorder" was tarakhe, however the use of chaos here was rooted in Hesiod ("Theogony"), who describes khaos as the primeval emptiness of the Universe, begetter of Erebus and Nyx ("Night"), and in Ovid ("Metamorphoses"), who opposes Khaos to Kosmos, "the ordered Universe." Chaotic is from 1713. Do we all see "from PIE base *gheu-, *gh(e)i- "to gape""? Yes? That is the end of the etymology. The remainder before "yawn)." deals with cognates. In other words, chaos no more comes from tarakhe than it does from lithos. The remainder deals with the writings of antiquity and has not a damned thing to do with why it was used in the article.
Now, how, besides the ref, do I know this? Because linguistics is one of my specialties and because I can read and write Classical Greek. Now that I have explained this, I do not expect to see that κοπρος βοος reinserted into the article. •Jim62sch• 23:02, 13 November 2006 (UTC)
- Yes, thanks Jim62; I read the chaos = disorder (Greek etymology) claim in a thermodynamics book yesterday. I'll leave the definition out until I can find the textbook source; it will take me some time to track down where I saw it, though (I'll keep looking). Thanks: --Sadi Carnot 08:23, 15 November 2006 (UTC)
- It's irrelevant what you find in a thermo textbook -- chaos does not come from tarakhe, period. If you find the link in a linguistics textbook, we'll talk. Otherwise, the issue is settled. •Jim62sch• 10:33, 15 November 2006 (UTC)
Order and Disorder
I have tagged this section for restructuring and rewriting: the section as it currently stands has little order and much disorder. The writing and organisation is chaotic, and some points do not logically follow, while others are bludgeoned into oblivion. •Jim62sch• 23:13, 13 November 2006 (UTC)
- Thanks, I'll work on cleaning it. No new article is perfect on its first day. I removed your tags (sorry); they seemed to have attracted a major case of vandalism. Let's use the talk page for cleaning issues. If you have any other suggestions or want to help clean, your contributions will be welcome. --Sadi Carnot 08:20, 15 November 2006 (UTC)
Recent massive edits
Well, now what? I have no idea what changes have been made, but maybe this is a good time to restructure the article. I assume that basically what has been done is that some stuff has been thrown out and there has been a reordering. Rather than just revert the whole thing, I have done some more restructuring, but no adding or subtracting. For example, the thermodynamic definition definitely does not belong way down at the bottom. After this restructuring, we can then go to the last edit before the s**t hit the fan and start restoring stuff that was deleted that should not have been. How's that for a plan? PAR 01:31, 15 November 2006 (UTC)
- Sorry, I hadn't checked my watchlist over the last day or so. I have no problem with a little restructuring. It seems User:Kmarinas86, a relatively new editor, has deleted almost five months' worth of hard work, 5 kb worth of material, almost one-third of the page, and over ten references, all without one talk page comment. His contributions seem to be in the religious area? Anyway, this is a case of major vandalism. I left a warning on his talk page. If he continues in this direction, I will request administrative assistance. I will revert back to Nov 13, and then we can restructure (slowly) from there. --Sadi Carnot 08:17, 15 November 2006 (UTC)
- Sadi, this edit summary is disingenuous, "(Revert major page vandalism, deletions (over 10 refs and 6 kb of material), per WP:V by user User:Kmarinas86)"[emphasis added], and is a misuse of WP:V. I suggest you read the guidelines for proclaiming vandalism before making such bold and baseless accusations in the future. Additionally, Kmarinas86 has been around two months longer than you have. Finally, Kmarinas86's edit history is irrelevant, unless you want us to bring up your massive edits to Love, Human bonding, Interpersonal chemistry, Limerence and Love (scientific views).
- I'll be contacting User:Kmarinas86 and retracting your warning. If you wish to file on WP:AN/I feel free to, although you will be opposed. Speaking of warnings, this is a warning regarding WP:AGF. Are we clear on this issue? •Jim62sch• 10:27, 15 November 2006 (UTC)
- How can I assume good faith, when he completely "deleted" an entire two-page section on order and disorder, with over 10 sources, that I had just spent hours working on? This is a section, mind you, that I was working on per your and others' requests for clarification and references. --Sadi Carnot 11:24, 15 November 2006 (UTC)
- As for my edit history, all my edits are with respect to science, the majority in thermodynamics. Human bonding, which is a suggested FA nomination article, as well as all the other articles you mention in regard to “massive edits”, are articles I started and are well sourced. Religious views, as we know, conflict with theories of evolution, the second law of thermodynamics, and entropy. This creates a conflict of interest, which in many cases results in biased edits. --Sadi Carnot 11:34, 15 November 2006 (UTC)
- Regarding restructuring, PAR's approach seems generally useful, with the minor caveat that defining the units should come early in the Definition and description of entropy. The article generally needs revision to make it intelligible to the ordinary reader, and this is a good start. Sadi's recent edits provide a suitable base to start the restructuring. As for the edits by Kmarinas86, we should assume good faith in an attempt to improve the structure and not bite the newbie, so I'll clarify that on Kmarinas86's user page. ... dave souza, talk 10:14, 15 November 2006 (UTC)
- First edit of October 2005 does not a newbie make. •Jim62sch• 10:28, 15 November 2006 (UTC)
- He is clearly a newbie to the entropy page, especially for such a massive deletion move. Moreover, I don't care who he is; he clearly edited in disregard of the 600+ kb worth of archived talk that went into this article. --Sadi Carnot 11:41, 15 November 2006 (UTC)
- Ah, good, Dave saved me the trouble. •Jim62sch• 10:29, 15 November 2006 (UTC)
I asked the user to explain his changes long before you appear to have noticed them. His response was not very clear. Let us continue to ask him what he was trying to do. At the same time, let nobody think that this article is even close to being a GA. Sorry I have not been around. I will be back, but I'm running a conference this weekend and I'm flat out. --Bduke 10:41, 15 November 2006 (UTC)
- Asking is good, accusing is bad. Agreed re GA. Good luck with the conference. •Jim62sch• 10:52, 15 November 2006 (UTC)
- This whole thing is a result of Jim62 putting two needless "attention attracting" clean-up tags on my recent "order/disorder" contribution. He quite obviously knew I started it the day before (basically owing to his and Dave Souza's request), and we know he objects to "order/disorder" in the first place. Please, I have been contributing a great deal to this page. If the page is not written as smoothly as it could be, then let's work on this rather than throwing "tags" and "flags" at each other. Thank-you. --Sadi Carnot 11:51, 15 November 2006 (UTC)
- Oh wait, the actions of another are my fault? That'd be funny if it weren't so idiotic. No wonder the concept of chaos and disorder appeals to you so. Has it ever occurred to you that maybe the section sucked (and still has problems)? You're all hung up on your FA crap; this thing doesn't even meet GA status (trust me, I had two FAs and several GAs in my first 5 months on Wiki). Also, if I wish to place tags, I'll do so. Capisce?
- BTW: do you really think you can speak for what I know in an abstract sense? “He quite obviously knew I started it the day before.” You have so many bloody edits it's hard to know what the hell you're doing (other than creating chaos and disorder).
- As for chaos and disorder equalling entropy -- yep, I think it's bollocks. I've explained the reasons, but you fail to comprehend them (at least that's the best guess I can make -- rather than respond, you scurry about looking for more examples to support your dogmatic "understanding" of entropy.) Feh. •Jim62sch• 23:43, 15 November 2006 (UTC)
- Jim, instead of being so negative all the time, why don’t you go down to the library, check out a book on entropy, read it, and then contribute some new material to the article. How’s that for an idea? Later: --Sadi Carnot 04:24, 16 November 2006 (UTC)
- That's the point, I have read books on entropy. However, my good man, the capacity to think for oneself is more important than reading a book and, from what I can see, merely accepting what one reads -- especially when it matches one's viewpoint. I too could parrot the disorder/chaos bit, but having examined it, I have found it to be wanting. I suppose that is one of the benefits of an interdisciplinary education and a desire to learn of everything in the full spectrum of human knowledge -- one is not locked into argumenta ad verecundiam. Vale. •Jim62sch• 11:15, 16 November 2006 (UTC)
- You seem to forget where you are. We are here to write up established views, made by other people, whether we agree with them or not. We are to do this using a neutral point of view. We are not here to derogate certain views. Hence, the capacity to read a book so as to be able to represent those views is more important, in Wikipedia, than that of using our own views, an approach that would border on “original research”, an outcome that is typically the result of “thinking for oneself”. Thank you: --Sadi Carnot 15:32, 18 November 2006 (UTC)
51 kb page length
The page is getting kind of long; I will move some of the "History" to History of entropy and some of the "Order and Disorder" to Entropy (order and disorder), and group up the 2nd law stuff, which currently has two separate sections. --Sadi Carnot 08:49, 15 November 2006 (UTC)
- Done: I moved a lot of extra stuff to pages of their own, gave the page a format cleaning (no deletions, other than duplicate sentences), and smoothed the page. It is now down to 37 kb and seems to give a good overview. Thanks for everyone's help. Maybe, down the road, working as a group, we can get the entropy page to FA-status. I'll request a peer-review in a few weeks. Later: --Sadi Carnot 09:47, 15 November 2006 (UTC)
- I took all of the related words out of "see also" (per WP:FA requirements), defined and sourced some new ones, and put them all in a new "definitions" section. The article is presently at 44 kb. Bduke and Jim62 feel the article is below GA status; possibly they could leave some feedback here so that we can improve in this direction. Thanks: --Sadi Carnot 14:03, 15 November 2006 (UTC)
Disorder vs. length of article
The two are not the same. It is strange that such an article, about disorder, defies having an orderly appearance (even when it is orderly in fact).Kmarinas86 00:44, 16 November 2006 (UTC)
Nice work PAR
PAR did some good work reorganizing the page; it feels much cleaner and the topics are grouped nicely. The main article seems to be in a relatively stable shape. If anyone has any major issues with the article or sees areas for improvement, I suggest we discuss first on the talk page. Later: --Sadi Carnot 04:45, 16 November 2006 (UTC)
- Good, I hope everyone agrees. The only problem I have is that you moved "information theory" to the bottom of the "approaches to understanding entropy" section, which is basically approaches to entropy that do not meet with universal acceptance, to put it mildly. Well, I don't like disorder, I don't like energy dispersal, but information entropy is identical to the statistical mechanical entropy, in the limit of large numbers of particles, once you specify the constant of proportionality and identify the probabilities in information theory with the probability of a microstate.
- In a more intuitively accessible statement, assuming that the number of microstates is large and finite (i.e. quantized), then
"The statistical mechanical entropy is proportional to the minimum number of yes/no questions you have to ask in order to determine the microstate, given that you know the macrostate.
- This is not an analogy, and it's not in the least bit vague. It's the unvarnished truth. This is IMHO a strong aid to understanding entropy. PAR 06:36, 16 November 2006 (UTC)
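A toy check of PAR's statement, in Python (the microstate count W = 1024 below is an invented number, not from any source in this thread): for W equally probable microstates, log2(W) is the number of yes/no questions a binary search needs to pin down the microstate, and k ln(W) is Boltzmann's entropy; the two differ only by the constant factor k ln(2).

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

W = 1024  # hypothetical number of microstates compatible with the macrostate

questions = math.log2(W)  # 10 yes/no questions identify one microstate
S = k_B * math.log(W)     # Boltzmann entropy S = k ln W

# The statistical entropy is the question count rescaled by k*ln(2):
assert math.isclose(S, k_B * math.log(2) * questions)
print(questions, S)  # 10.0, ~9.57e-23 J/K
```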
- Thanks, PAR, that gives us a much clearer structure to work within. As you say, sections can be moved subject to agreement: in my opinion "information theory" follows naturally from "Microscopic viewpoint (statistical mechanics)" and should follow immediately after that section, making it clear that the distinction is that energy is not involved in information theory, so Temperature is replaced by a constant. Thus, as you say, it gives a strong aid to understanding the probability aspect of stat mech, which differs in requiring energy to make the probabilities happen. Something that probably underlies disagreements about the article but isn't really made explicit is that various disciplines have significantly different views of entropy: mechanical engineering, physics and chemistry take different approaches, each considering their understanding to be fundamental. This could be explained under "Approaches to understanding entropy". At some time in the past a decision was made to focus this article on thermodynamic entropy, but now we're getting significant mention of the other main aspect, information theory, and brief definitions of other offshoots: it makes sense to me to have this main article cover all types of entropy, at least in summary style. Though much of the focus will still be on explaining thermodynamic entropy, this would make the article more general. .. dave souza, talk 09:56, 16 November 2006 (UTC)
- Information theory should probably have its own header, not listed under “approaches to understanding entropy”. Readers need to be clearly informed that one deals with energy loss in engine cycles while the other deals with information loss in phone line signals, but also that many have attempted to find mathematical correlations between the two. Information theory, as structured over Shannon’s entropy, is almost a whole branch of science of its own. Most information theory people, economists, sociologists, etc., will argue that Shannon’s entropy and Clausius’ entropy are identical, whereas most chemists, physicists, and thermodynamicists will argue that they have nothing to do with each other. We certainly shouldn’t argue about this point here, but then again, we shouldn’t attempt to blend the two together as though they were the same thing. Entire papers and chapters have been written about this argument, e.g. 1999 Entropy Article --Sadi Carnot 16:16, 18 November 2006 (UTC)
- As I said before, this section should really be entitled "favorite theories under dispute" or something. Boltzmann's statistical mechanical entropy is, in fact, a particular case of Shannon entropy. The two are related by:
- S = k H = −k Σ p_i ln p_i    (p_i = probability of microstate i)
- where k is Boltzmann's constant. This is a simple, indisputable, mathematical fact. That it is in fact under dispute here is, I assume, because no one has taken the time to investigate the subject. Therefore it deserves, at the very least, a subsection in "favorite theories under dispute" until such time as people come to realize that it is indisputable, at which point it can be moved up into the "definition of Entropy" section. I would oppose any downgrading of the idea, giving it less significance than the "energy dispersal" or "order/disorder" approaches. PAR 17:10, 18 November 2006 (UTC)
Yes, I agree that it is a mathematical fact that the equations are nearly the same, and I am not suggesting a downgrade, just that the header should be something like "Entropy as information" or something; as to ordering, I would do it chronologically: Carnot's pre-entropy (1824), Clausius's entropy (1862), Boltzmann's entropy (1870), Gibbs' entropy (1903), Shannon's entropy (1949), Atkins' entropy (1984), etc. Later: --Sadi Carnot 17:52, 18 November 2006 (UTC)
- To be clear, my opinion is that information entropy is a significant aspect of entropy that should have its own sub-section under Definition and description of entropy where it is explained in relation to statistical entropy: which comes first is a matter of which gives the clearest explanation to the ordinary reader, though my immediate inclination is to have stat. mech. first as following on from the macroscopic viewpoint. To me, Approaches to understanding entropy is about descriptions introducing people to the concepts set out in the previous section. There does seem to be a need for a description of chemical entropy in the Definition etc. section, and a subsection on that would be welcome. However the historical sequence which is appropriate for the history article shouldn't get in the way of a logical sequence for explaining the various concepts. ... dave souza, talk 22:35, 18 November 2006 (UTC)
- Dave, as to your comment “my opinion is that information entropy is a significant aspect of entropy”: this is not an established fact. Yes, many do argue in this direction, but more argue in opposition to this direction:
- It's not a good idea to tip-toe around this issue as though it weren't there. You will have to find some strong supporting references for that statement if you wish to assume they are the same and that the world agrees with this. Later: --Sadi Carnot 01:44, 21 November 2006 (UTC)
- What is "chemical entropy"? Regarding the order thermo/stat mech order, I somewhat agree - a complete and detailed description of thermodynamic (Clausius) entropy should not be made without the stat-mech (Boltzmann) viewpoint to assist in the understanding. Either stat mech first, or maybe thermo (simple), then stat mech (simple) then the union of both. PAR 23:33, 18 November 2006 (UTC)
- My poor phrasing; I should have referred to entropy in chemistry. The article covers heat transfer, but gives little or no indication of the significance of thermodynamic entropy in chemical reactions and standard entropies for materials, a subject which at least should be outlined. The order thermo (simple), stat mech (simple), then information entropy seems appropriate, and the info entropy section could open by stating that the Boltzmann viewpoint is a particular case of Shannon entropy, using your description above, then explaining the basics, with the definitions listed at present being incorporated into the explanation. .. dave souza, talk 17:37, 19 November 2006 (UTC)
- Ok, I see what you're saying, and I agree. A short section on entropy as it is involved in chemical reactions would go in the "description and definition" section, since it is important and not a disputed concept. It should be short, however, and point to a "main" article like chemical equilibrium or something. I also like the thermo-statmech-both idea, because the thermodynamic description is the real definition of entropy, while the statistical mechanical description is an explanation of entropy. By that I mean, if you want to measure entropy, you measure heat and divide by temperature; you don't measure the number of available microstates. PAR 18:46, 19 November 2006 (UTC)
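To put illustrative numbers on that prescription (a standard textbook case, not one raised in the discussion above): reversibly melting one mole of ice at its melting point, with q_rev equal to the molar enthalpy of fusion (about 6010 J/mol), gives

```latex
\Delta S = \frac{q_{\mathrm{rev}}}{T}
         = \frac{6010\ \mathrm{J\,mol^{-1}}}{273.15\ \mathrm{K}}
         \approx 22.0\ \mathrm{J\,mol^{-1}\,K^{-1}}
```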
Yes, good idea as to “entropy in chemistry”; the “see main” header, however, would be chemical thermodynamics. Using Brown, LeMay, and Bursten’s Chemistry – The Central Science (9th Ed.) as a reference, for example, ch. 5 (Thermochemistry) is about enthalpy change and Hess’s law, ch. 15 (Chemical Equilibrium) is about the rate constant, and ch. 19 (Chemical Thermodynamics) is where the entropy stuff is. It has the following outlined sections:
- 19.2: “Entropy and the Second Law of Thermodynamics”
- 19.3: “The Molecular Interpretation of Entropy”
- 19.4: “Entropy Changes in Chemical Reactions”
- 19.5: “Gibbs Free Energy”
- 19.6: “Free Energy and Temperature”
- 19.7: “Free Energy and the Equilibrium Constant”
I only recently started the chemical thermodynamics article, however, and it still needs a lot of work. Later: --Sadi Carnot 01:34, 21 November 2006 (UTC)
Results of Automated Peer Review
Entropy
The following suggestions were generated by a semi-automatic javascript program, and might not be applicable for the article in question.
- The lead of this article may be too long, or may contain too many paragraphs. Please follow guidelines at WP:LEAD; be aware that the lead should adequately summarize the article.
- The lead is for summarizing the rest of the article, and should not introduce new topics not discussed in the rest of the article, as per WP:LEAD. Please ensure that the lead adequately summarizes the article.
- Per WP:MOS, avoid using words/phrases that indicate time periods relative to the current day. For example, recently might be terms that should be replaced with specific dates/times.
- Per WP:MOS#Headings, headings generally do not start with the word 'The'. For example, ==The Biography== would be changed to ==Biography==.
- Per WP:MOS#Headings, headings generally should not repeat the title of the article. For example, if the article was Ferdinand Magellan, instead of using the heading ==Magellan's journey==, use ==Journey==.
- Generally, trivia sections are looked down upon; please either remove the trivia section or incorporate any important facts into the rest of the article.
- Please alphabetize the interlanguage links.
- Per WP:WIAFA, this article's table of contents (ToC) may be too long- consider shrinking it down by merging short sections or using a proper system of daughter pages as per WP:SS.
- There are a few occurrences of weasel words in this article- please observe WP:AWT. Certain phrases should specify exactly who supports, considers, believes, etc., such a view.
- "it has been" might be weasel words, and should be provided with proper citations (if they already do, or are not weasel terms, please strike this comment).
- Please make the spelling of English words consistent with either American or British spelling, depending upon the subject of the article. Examples include: meter (A) (British: metre), organize (A) (British: organise), realize (A) (British: realise), ization (A) (British: isation), isation (B) (American: ization).
- Watch for redundancies that make the article too wordy instead of being crisp and concise. (You may wish to try Tony1's redundancy exercises.)
- Vague terms of size often are unnecessary and redundant - “some”, “a variety/number/majority of”, “several”, “a few”, “many”, “any”, and “all”. For example, “All pigs are pink, so we thought of a number of ways to turn them green.”
- Please ensure that the article has gone through a thorough copyediting so that it exemplifies some of Wikipedia's best work. See also User:Tony1/How to satisfy Criterion 1a.
You may wish to browse through User:AndyZ/Suggestions for further ideas. Thanks, Kmarinas86 06:31, 16 November 2006 (UTC)
- Note: the link in this sub-heading just leads to this Entropy article. ..dave souza, talk 10:12, 16 November 2006 (UTC) rethought 10:26, 16 November 2006 (UTC)
Implementation
Thanks for getting that, it gives useful pointers. The lead clearly needs to be rethought, and I'll try to consider that: to some extent it will follow from the structure and hence the contents of the article. The article's recently acquired lists of brief definitions without explanations: these should be incorporated into explanatory text, or moved to Entropy (disambiguation). There's also rather a tendency to throw in obsolescent terms – these are appropriate in the History of entropy article, but tend to make the main article more confusing, which we can do without. .. dave souza, talk 10:12, 16 November 2006 (UTC)
- What obsolete terms are you referring to specifically? As to moving the definitions I collected to disambig, I would prefer to see them made into paragraphs. Disambig pages are for unrelated topics that have the same common name. The definitions I added are all related to Clausius’ entropy in some way or another. --Sadi Carnot 16:27, 18 November 2006 (UTC)
Adiabatic demag
The more I ponder the concept, the more wrong it appears to be. I note that in the 7th edition (2005), there is no mention of adiabatic demagnetization. In what chapter/topic did it appear in the 3rd edition (1988) that you used? The presentation you give on 'order/disorder' of 'atomic order-disorder' is simplistic to the point of meaninglessness. The five-step diagram in the wiki article "Magnetic Refrigeration" is far superior and much more essential for gaining an understanding of the process.
- .." In the first step of magnetization, the external magnet aligns the spins of the 'atomic magnets' in a system of inorganic crystals that have been cooled to the temperature of liquid helium. This magnetic alignment greatly increases the gap between spin energy levels. Because this means that there fewer accessible energy levels, the original energy cannot be as dispersed s it was -- and the excess undispersed spin energy is transferred to lattice energy. Then, because lattice energy determines the temperature of the crystals, this increase in lattice energy causes the crystal temperature to rise. After the external cooling bath of liquid helium cools the warmer magnetiized system to liquid He temperature (about 1K,
the system is insulated and the external magnetic field rapidly decreased. However, now the spin levels are returned to their original smaller-gap levels and there are many un- or relatively sparsely-occupied spin levels and energy is transferred from the lattice energy of the crystal to the energy-deficient atomic spins to restore their energetic equilibrium, Consequently, the decrease in lattice energy means that the crystals drop below the temperature at which they were (i.e., below liquid He temperature)."
Thus, in no sense is the appearance of order and disorder causal in this process (nor are they ever causal re entropy; they are mere symptoms). Energy transfer/T (dispersal, or if you prefer "energy flow"), as in all entropy changes, is the proper focus, as it has been since Clausius! •Jim62sch• 10:29, 17 November 2006 (UTC)
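The energy-flow account above can be made quantitative with the ideal spin-1/2 paramagnet treated in Kittel and Kroemer. Below is a minimal Python sketch under those idealized assumptions (lattice contribution ignored; the field and temperature values are arbitrary illustrative numbers, not from any source here):

```python
import numpy as np

def spin_entropy(B, T, mu=1.0, k=1.0):
    """Entropy per spin (in units of k) of an ideal spin-1/2 paramagnet:
    S/Nk = ln(2 cosh x) - x tanh x, where x = mu*B/(k*T)."""
    x = mu * B / (k * T)
    return np.log(2.0 * np.cosh(x)) - x * np.tanh(x)

T_i, B_i, B_f = 1.0, 10.0, 0.1  # arbitrary illustrative values

# Step 1 (isothermal magnetization): raising B at fixed T lowers the spin
# entropy, so heat q = T*dS < 0 must flow out to the helium bath.
print(spin_entropy(0.0, T_i), "->", spin_entropy(B_i, T_i))

# Step 2 (adiabatic demagnetization): S depends on B and T only through B/T,
# so holding S fixed while lowering the field gives T_f = T_i * B_f / B_i.
T_f = T_i * B_f / B_i
assert np.isclose(spin_entropy(B_i, T_i), spin_entropy(B_f, T_f))
print("final temperature:", T_f)  # 0.01 in these units
```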
- See: ch. 22, pgs: 525-26. Later: --Sadi Carnot 16:22, 18 November 2006 (UTC)
- Which edition? •Jim62sch• 22:45, 18 November 2006 (UTC)
- 3rd Ed. Later: --Sadi Carnot 01:16, 21 November 2006 (UTC)
- Ah, yes, I see. Just to verify, you were referring to pp. 525-526 in the 3rd edition of Halliday and Resnick, yes? I suppose that is why I felt the description you offer of adiabatic demagnetization was rather weak. Those pages are baby steps to entropy itself, and have in fact been removed from the most recent editions. Myself, I prefer my specific and detailed analysis based on Kittel and Kroemer, as I prefer the entree to the appetizer. ;) •Jim62sch• 22:13, 22 November 2006 (UTC)
Lead
- see entropy query that started this.
This really needs simplifying to make it understandable by the average person. I am a degree-qualified (electronics) engineer and I can't understand it! So what chance does Joe Public have?--Light current 02:24, 21 November 2006 (UTC)
- Well, I am a degree-qualified (electrical) engineer and I can understand it! Moreover, as a group we have been working on trying to improve the lead towards the "novice" perspective over the last several months. Check the archives. This is a big issue. It is difficult not to compromise the integrity of the article for the sake of Joe Public by dumbing it down so much that it isn't even credible any more. Please, per the requirements of WP:Featured Article and WP:Lead, the "lead" needs to be at least 3 paragraphs. I will attach a link to Introduction to entropy, which is where you may want to take your concerns. Thank you: --Sadi Carnot 02:45, 21 November 2006 (UTC)
- When it comes to improving the quality and understandability of any article, the rule is WP:IAR. I suggest the only reason you understand it is that you have been working on it for 3 months. Most readers don't have that sort of time! 8-|--Light current 06:04, 21 November 2006 (UTC)
- If you have suggestions or ideas on how to facilitate the understanding of entropy without making up original research or conceptions, please feel free. Cutting the intro down to one paragraph, however, doesn't help. Thanks: --Sadi Carnot 13:09, 21 November 2006 (UTC)
Merge of Introduction to entropy
I have changed the merge tags to mergeto and mergefrom, which is the best way, and this brings the discussion to one place - here. I am not sure I agree with the merge, but I think it needs to be discussed, and it needs to be discussed carefully and slowly. Let us not rush into this. I have been busy with a conference, but I plan to spend some time now thinking and working on this pair of articles. The first paragraph really does still need work. I'll make some suggestions soon. --Bduke 06:32, 21 November 2006 (UTC)
- Thanks for doing that. 8-)--Light current 06:42, 21 November 2006 (UTC)
- Thanks for picking this up. I've been doing some thinking about the introduction to both articles, and have now drafted my ideas as a new introduction to Introduction to entropy, with the previous introduction being moved unchanged to an "Overview" section. I've also included a simpler version of the glass of water example, making it clear that the ice and water have been allowed to reach equilibrium, and have cut back the "disorder / dispersal" stuff which duplicates the articles now linked. Please treat these ideas as a possible starting point: it's far from being a finished article.
- As for merging, there's a desperate need for the main Entropy article to be clarified so that it can be understood by ordinary readers, and the introduction article can provide ideas for such improvement. However, as a separate article it allows explanations to be given at greater length than might be appropriate in a main article, and so I think there's a case for keeping it for that purpose. .. dave souza, talk 11:58, 21 November 2006 (UTC)
- We already voted not to merge these pages last month; plus, the main entropy page is over the limit in file size, pushing nearly 60 kilobytes. The introduction to difficult topics is part of a series:
These simple "introduction" pages are not my idea; but, then again, I'm not going to go to all of these pages and request mergers. I suggest that we use them accordingly now that people are making them (for whatever reason). Thanks, I will remove the merge tags. --Sadi Carnot 13:21, 21 November 2006 (UTC)
Thermodynamic entropy vs. information entropy
Hi all, I plan to start this header. I keep finding these posts by random users writing in and asking if they are the same or if they are different. This issue needs to be addressed in an objective neutral point of view. Below is an example debate on this issue between four physicists:
Moreover, here is an informative article (pasted section) from that page (sourced from here) that exemplifies the confusion and the distinction between the two:
- Thanks for showing interest, but still it isn't clear enough for me. The fact that the equations are basically the same (at least in appearance) does not convince me enough yet to say that both terms are indeed the same one. Read this:
- "Shannon's information entropy function has exactly the same form as the equation of Boltzmann's H-Theorem:
- H(t) = ∫ f ln f dc
- where ∫ is the integral symbol from calculus, ln means natural (base e) logarithm, f is the distribution function for molecules in an ideal gas, and c is the velocity space. The symbol H is used in Information Theory because of this similarity.
- Interestingly, Brownian motion (the random thermal motion of molecules) is also a Markov process. It is from the H(t) formula that we can derive:
- S = k ln w
- where S is the thermodynamic entropy of a system, k is Boltzmann's constant, and w is the disorder of the system; that is, the probability that a system will exist in the state it is in relative to all the possible states it could be in. Boltzmann's H-Theorem tells us that, after a long time, f will reach equilibrium. This is similar to what Shannon tells us about information sources modeled as ergodic processes. Despite the similarities, Shannon entropy and thermodynamic entropy are not the same. Thermodynamic entropy characterizes a statistical ensemble of molecular states, while Shannon entropy characterizes a statistical ensemble of messages.
- In thermodynamics, entropy has to do with all the ways the molecules or particles might be arranged, and greater entropy means less physical work can be extracted from the system. In Shannon’s usage, entropy has to do with all the ways messages might be transmitted by an information source, and greater entropy means the messages are more equally probable. Entropy in information theory does not mean information is becoming more useless or degraded; and because it is a mathematical abstraction, it does not directly relate to physical work unless you are treating molecules informatically.
- Shannon entropy has been related by physicist Léon Brillouin to a concept sometimes called negentropy. This is a term introduced by physicist and Nobel laureate Erwin Schrödinger in his 1944 text What is Life to explain how living systems export entropy to their environment while maintaining themselves at low entropy; in other words, it is the negative of entropy. In his 1962 book Science and Information Theory, Brillouin described the Negentropy Principle of Information or NPI, the gist of which is that acquiring information about a system’s microstates is associated with a decrease in entropy (work is needed to extract information, erasure leads to increase in thermodynamic entropy). There is no violation of the Second Law of Thermodynamics involved, since a reduction in any local system’s thermodynamic entropy results in an increase in thermodynamic entropy elsewhere.
- The relationship between Shannon's information entropy H and the entropy S from statistical mechanics was given more rigor by Edwin Jaynes in 1957. The upshot is that information entropy and thermodynamic entropy are closely related metrics, but are not the same metric. For most practitioners of Information Theory up to now, this poses no difficulty, because their field is communication and computation using conventional electronic circuits, where the thermodynamic meaning of entropy is not discussed. The conflicting terminology results in much confusion, however, in areas like molecular machines and physics of computation, where information and thermodynamic entropy are dealt with side by side. Some authors, like Tom Schneider, argue for dropping the word entropy for the H function of Information Theory and using Shannon's other term, uncertainty (average surprisal), instead. For more on this see Information Is Not Entropy, Information Is Not Uncertainty!
- Unlike molecular entropy, Shannon entropy can be locally reduced without putting energy into the information system. Simply passing a channel through a passive filter can reduce the entropy of the transmitted information (unbeknownst to the transmitter, the channel capacity is reduced, and therefore so is the entropy of the information on the channel). The amount of power needed to transmit is the same whether or not the filter is in place, and whether or not the information entropy is reduced. Another way to think about this is to cut one wire of a channel having multiple parallel wires. The average information going across the channel, the entropy, goes down, with no relationship to the amount of energy needed to cut the wire. Or shut off the power supply to an information source and watch its output fix on one single symbol “off” with probability 1 and information entropy 0."
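A minimal Python sketch of the closing example in the passage above, under the stated definition of Shannon entropy (a function of the symbol probabilities alone; no energy term appears anywhere in the formula):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform source
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits: skewed source
print(shannon_entropy([1.0]))                     # 0.0 bits: stuck on "off"
```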
I will group this section below the Statistical Thermodynamics section; if anyone wants to re-place this somewhere else, I would have no objection; hopefully we can all chip in. More talk-page examples are below:
- Person who states they are the same
- Person who states they are not the same
- Argument about information entropy units
- Argument about the utility of information entropy in physics
- Argument about the separate distinction between Shannon's entropy and thermodynamic entropy
- Discussion on the distinction between the two
- More debate on the distinction between the two
- Page contrasting the two
- Entropy-themed story that mixes it all together
- BBC article that assumes they are the same
It's not a good idea to tip-toe around this issue as though it weren't there. The distinction between these two is a major issue that we need to address. I’ve linked to two talk pages now and one article where people have been confused about this and have written in trying to find the answer. I suggest that we provide this answer here, very clearly. Thanks: --Sadi Carnot 13:42, 21 November 2006 (UTC)
- May I add this paper to the reading list:
- R. Balian, Entropy, a Protean Concept, Prog. Math. Phys. 38, 119-144 (2004), Séminaire Poincaré 2, 13-27 (2003) online
- IMHO it's rather insightful, and especially fig. 2, comparing the "different entropies", is nice.
- Pjacobi 14:42, 21 November 2006 (UTC)
- Good work, I'll ref that in the new paragraph I'm going to add. --Sadi Carnot 15:03, 21 November 2006 (UTC)
- I have moved the "info theory entropy" subsection back into the "approaches" section. The first section should contain non-controversial descriptions and definitions; the second section will be for the contentious stuff. Also, I expanded the "pro" viewpoint in the info-theory section. For those who are leery of the information theory approach, please read it carefully; I have really tried hard to explain the link between the two. - Thanks PAR 18:18, 21 November 2006 (UTC)