Talk:Second law of thermodynamics


Archives

Cut sloppy definitions

I cut the following ("consumed" was a word used back in the 17th century):

  • If thermodynamic work is to be done at a finite rate, free energy must be consumed [1]
  • The entropy of a closed system will not decrease for any sustained period of time (see Maxwell's demon)
A downside to this last description is that it requires an understanding of the concept of entropy. There are, however, consequences of the second law that are understandable without a full understanding of entropy. These are described in the first section below.

For such an important & confusing law in science, we need textbook definitions with sources.--Sadi Carnot 04:58, 3 March 2006 (UTC)

I don't understand why these two were selected for removal and no textbook definitions with sources were added. One way to make things clear to the general reader is to limit the way the law is expressed to a very strict definition used in modern physics textbooks. A second way is to express the law multiple times so that maybe one or two of them will "stick" in the mind of a reader with a certain background. I think the second approach is better. The word "consumed" was used and referenced to an article from 2001 about biochemical thermodynamics by someone at a department of surgery. When the second law is applied to metabolism, the word "consumed" seems to me to be current and appropriate. Flying Jazz 12:49, 3 March 2006 (UTC)

I don't understand why the entropy definition was deleted. It's not wrong or confusing. -- infinity0 16:17, 3 March 2006 (UTC)

Regarding the first statement, in 1824 Sadi Carnot, the originator of the 2nd Law mind you, states: “The production of motive power is then due in steam-engines not to an actual consumption of the caloric, but to its transportation from a warm body to a cold body, that is to its re-establishment of equilibrium.” Furthermore, for the main statements of the 2nd Law we should be referencing someone from a department of engineering, not someone from a department of surgery; certainly this reference is good for discussion, but not as a principal source for one of the grandest laws in the history of science.

Regarding the second statement, to within an approximation, the earth delineated at the Karman line can be modeled as a closed system, i.e. heat flows across the boundary daily in a significant manner, but matter, aside from the occasional asteroid input and negligible atmospheric mass loss, does not. Hence, entropy has decreased significantly over time. This model contradicts the above 2nd Law statement.

Regarding the layperson viewpoint, certainly it is advisable to re-word things to make them stick, but when it comes to laws precise wording is a scientific imperative.--Sadi Carnot 18:37, 3 April 2006 (UTC)

New "definition"

User talk:24.93.101.70 keeps inserting this in:

  • If thermodynamic work is to be done at a finite rate, free energy must be consumed. (This statement is based on the fact that free energy can be accurately defined as that portion of any First-Law energy that is available for doing thermodynamic work; i.e., work mediated by thermal energy. Since free energy is subject to irreversible loss in the course of thermodynamic work and First-Law energy is not, it is evident that free energy is an expendable, Second-Law kind of energy.)

This definition is neither succinct nor clear. It's also very confusing and very long-winded, and it makes no sense (to me, and therefore to the reader too). "First-Law" energy? What's that? -- infinity0 22:41, 21 March 2006 (UTC)

Agree, replaced the word "consumed" accordingly; 1st Law = "Energy is Conserved" (not consumed).--Sadi Carnot 19:12, 3 April 2006 (UTC)

Mathematical definition

I tweaked the mathematical definition a little, which probably isn't the best way to do it. However I was thinking that a much more accurate mathematical description would be:

"As the time over which the average is taken goes to ∞:

\mathrm{Average}\left(\frac{dS}{dt}\right) \ge 0

where

\frac{dS}{dt} is the instantaneous change in entropy per unit time"

Comments? Fresheneesz 10:43, 4 April 2006 (UTC)

I feel this is a worthwhile point; however, it should go in a separate "header" section (such as: Hypothetical violations of the 2nd Law), and there should be reference to such articles as: 2nd Law Beads of Doubt. That is, the main section should be clear and strong. Esoteric deviations should be discussed afterwards. --Sadi Carnot 15:54, 4 April 2006 (UTC)
If you correctly state the second law such that the entropy increases on average, then there are no violations of the 2nd law. (The terminology used in your reference is dodgy.) That the entropy can sometimes go down is no longer hypothetical, since transient entropy reductions have been observed. Really there should be links to the Fluctuation theorem and Jarzynski equality, since these are, essentially, new and improved, cutting edge, extensions of the second law. Nonsuch 19:18, 4 April 2006 (UTC)
The idea that the entropy increases on the average (not all of the time) is very old, and should be included. It is not necessary to average over infinite time: on average, the entropy increases with time, for any time interval. The reason that this point is normally glossed over is that for a macroscopic system we can generally ignore the differences between average and instantaneous values of energy and entropy. But these days we can do experimental thermodynamics on microscopic systems. Nonsuch 19:18, 4 April 2006 (UTC)
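(An illustrative aside on the "increases on average" point: the toy sketch below, in Python, uses the Ehrenfest urn model -- my own choice of example, nothing cited in this thread -- with N particles hopping between two halves of a box and Boltzmann entropy S = ln C(N,k). Individual steps can and do lower S, but from a low-entropy start the trend is upward.)

  import math, random

  # Ehrenfest urn model: N particles, k currently on the left side.
  # At each step one particle, chosen at random, hops to the other side.
  def entropy(N, k):
      return math.log(math.comb(N, k))   # Boltzmann S in units of k_B

  def run(N=100, steps=2000, seed=1):
      rng = random.Random(seed)
      k, trace = 0, []                   # start in the lowest-entropy state
      for _ in range(steps):
          if rng.random() < k / N:
              k -= 1                     # chosen particle was on the left
          else:
              k += 1                     # chosen particle was on the right
          trace.append(entropy(N, k))
      return trace

  trace = run()
  drops = sum(b < a for a, b in zip(trace, trace[1:]))
  print("final S = %.2f, max S = %.2f" % (trace[-1], entropy(100, 50)))
  print("steps where S decreased: %d of %d" % (drops, len(trace) - 1))

(For N = 100 a sizeable fraction of steps still lower S; for macroscopic N those fluctuations become negligible, which is the micro-versus-macro point above.)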

See also "Treatise with Reasoning Proof of the Second Law of Energy Degradation"

"Treatise with Reasoning Proof of the Second Law of Energy Degradation: The Carnot Cycle Proof, and Entropy Transfer and Generation," by Milivoje M. Kostic, Northern Illinois University http://www.kostic.niu.edu/Kostic-2nd-Law-Proof.pdf and http://www.kostic.niu.edu/energy

“It is crystal-clear (to me) that all confusions related to the far-reaching fundamental Laws of Thermodynamics, and especially the Second Law, are due to the lack of their genuine and subtle comprehension.” (Kostic, 2006).

There are many statements of the Second Law which in essence describe the same natural phenomena about the spontaneous direction of all natural processes towards a stable equilibrium with randomized redistribution and equi-partition of energy within the elementary structure of all interacting systems (thus the universe). Therefore, the Second Law could be expressed in many forms reflecting impossibility of creating or increasing non-equilibrium and thus work potential between the systems within an isolated enclosure or the universe:

1. No heat transfer from low to high temperature in a no-work process (like an isochoric thermo-mechanical process).

2. No work transfer from low to high pressure in a no-heat process (an adiabatic thermo-mechanical process).

3. No work production from a single heat reservoir, i.e., no more efficient work-producing heat engine cycle than the Carnot cycle.

4. Etc, etc … No creation or increase of non-equilibrium and thus work potential, but only decrease of work potential and non-equilibrium towards a common equilibrium (equalization of all energy-potentials) accompanied with entropy generation due to loss of work potential at system absolute temperature, resulting in maximum equilibrium entropy.

All the Second Law statements are equivalent since they reflect equality of work potential between all system states reached by any and all reversible processes (reversibility is measure of equivalency) and impossibility of creating or increasing systems non-equilibrium and work potential.

About Carnot Cycle: For given heat reservoirs’ temperatures, no other heat engine could be more efficient than a (any) reversible cycle, since otherwise such reversible cycle could be reversed and coupled with that higher efficiency cycle to produce work permanently (create non-equilibrium) from a single low-temperature reservoir in equilibrium (with no net-heat-transfer to the high-temperature reservoir). This implies that all reversible cycles, regardless of the cycle medium, must have the same efficiency which is also the maximum possible, and that irreversible cycles may and do have smaller, down to zero (no net-work) or even negative efficiency (consuming work, thus no longer power cycle). Carnot reasoning opened the way to generalization of reversibility and energy process equivalency, definition of absolute thermodynamic temperature and a new thermodynamic material property “entropy,” as well as the Gibbs free energy, one of the most important thermodynamic functions for the characterization of electro-chemical systems and their equilibriums, thus resulting in formulation of the universal and far-reaching Second Law of Thermodynamics. It is reasoned and proven here that the net-cycle-work is due to the net-thermal expansion-compression, thus necessity for the thermal cooling and compression, since the net-mechanical expansion-compression is zero for any reversible cycle exposed to a single thermal reservoir only.
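(A numeric aside, not part of Kostic's text: the coupled-cycles argument above pins every reversible engine between two given reservoirs to the same efficiency, the Carnot value. The reservoir temperatures below are assumed purely for illustration.)

  # Carnot bound implied by the coupled-cycles argument.
  T_h, T_c = 500.0, 300.0                 # assumed reservoir temperatures, K
  print("reversible (Carnot) efficiency:", 1.0 - T_c / T_h)   # 0.4
  # An engine claiming a higher efficiency between these reservoirs could
  # drive a reversed Carnot cycle and pump heat from cold to hot with no
  # net work input -- exactly the impossibility described above.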

In conclusion, it is only possible to produce work during energy exchange between systems in non-equilibrium. Actually, the work potential (the maximum possible work to be extracted in any reversible process from the systems' non-equilibrium to the common equilibrium) is a measure of the systems’ non-equilibrium, thus the work potential could be conserved only in processes where the non-equilibrium is preserved (conserved, i.e. rearranged), and such ideal processes could be reversed. When the systems come to equilibrium there is no potential for any process to produce (extract) work. Therefore, it is impossible to produce work from a single thermal reservoir in equilibrium (otherwise non-equilibrium would be spontaneously created, leading to a “black-hole-like energy singularity” instead of to the equilibrium with randomized equi-partition of energy). It is only possible to produce work from thermal energy in a process between two thermal reservoirs in non-equilibrium (with different temperatures). Maximum work for a given heat transfer from a high- to a low-temperature thermal reservoir will be produced during an ideal, reversible cyclic process, in order to prevent any other impact on the surroundings (like net-volume expansion, etc.; the net cyclic change is zero).

All real natural processes between systems in non-equilibrium have a tendency towards common equilibrium and thus loss of the original work potential, by converting other energy forms into thermal energy, accompanied by an increase of entropy (randomized equi-partition of energy per absolute temperature level). Due to the loss of work potential in a real process, the resulting reduced work cannot reverse the process back to the original non-equilibrium, as is possible with ideal reversible processes. Since non-equilibrium cannot be created or increased spontaneously (by itself and without interaction with the rest of the surroundings), all reversible processes must be the most, and equally, efficient (they will equally conserve work potential; otherwise non-equilibrium would be created by coupling with differently efficient reversible processes). The irreversible processes will lose work potential to thermal energy with an increase of entropy, and thus will be less efficient than corresponding reversible processes … this will be further elaborated and generalized in other Sections of this “Treatise with Reasoning Proof of the Second Law of Energy Degradation” -- the work is in its final stage, to be finished soon.

http://www.kostic.niu.edu and http://www.kostic.niu.edu/energy
Kostic, what is the point of all this verbiage? Why is it at the top of the talk page? Why does it make no sense? Nonsuch 04:20, 12 April 2006 (UTC)

Miscellany - Cables in a box

This description of the 2nd Law was quoted in a slightly mocking letter in the UK newspaper The Guardian on Wednesday 3rd May 2006. It has been in the article unchanged since 12 September 2004 and at best needs editing, if not deleting. To me it adds little in the way of clarity. Malcolma 11:50, 4 May 2006 (UTC)

Deleted. MrArt 05:19, 14 November 2006 (UTC)

Why not a simple statement?

I would propose to add the following statement of the 2nd law to the "general description" section, or even the intro: "Heat cannot of itself pass from a colder to a hotter body." (Clausius, 1850). Unlike all the others, this statement of the law is understandable by everybody: any reason why it's not in the article? Pcarbonn 20:22, 6 June 2006 (UTC)

Excellent idea! LeBofSportif 20:44, 6 June 2006 (UTC)

Order in open versus closed systems

Propose adding the following section:


Order in open versus closed systems

Granville Sewell has developed the equations for entropy change in open versus closed systems.[1][2] He shows, e.g., that S_t >= - (integral of the heat flux vector J through the boundary).

In summary:

  1. Order cannot increase in a closed system.
  2. In an open system, order cannot increase faster than it is imported through the boundary.

Sewell observes:

The thermal order in an open system can decrease in two ways -- it can be converted to disorder, or it can be exported through the boundary. It can increase in only one way: by importation through the boundary.[3]

DLH 03:41, 13 July 2006 (UTC)

Added a link to Sewell's appendix D posted on his site: Sewell, Granville (2005) Can "ANYTHING" Happen in an Open System?, and his key equation D.5 (i.e., S_t >= - integral of the heat flux vector J through the boundary). This is a key formulation that is not shown so far in the Wiki article. I will add the explicit equations, especially D.4 and D.5. DLH 14:05, 13 July 2006 (UTC)


Notable? FeloniousMonk 05:54, 13 July 2006 (UTC)
Not highly, I should say, but he most certainly has a clear bias: idthefuture.
Besides, I'm not sure what the relevance of talking about a closed system is as it's inapplicable to Earth; and the open system quotation sounds like words in search of an idea.
In any case, this is yet another example of a creationist trying to manipulate 2LOT to support his conclusion, generally by misrepresenting both 2LOT and evolution. To wit:
  • "The evolutionist, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn’t, that atoms would rearrange themselves into spaceships and computers and TV sets."page 4
•Jim62sch• 09:35, 13 July 2006 (UTC)
Please address Sewell's derivation of the heat flux through the boundary, equations D.1-D.5, as I believe they still apply. See: Sewell, Granville (2005) Can "ANYTHING" Happen in an Open System? DLH 14:46, 13 July 2006 (UTC)
A mainstream way of treating DLH's point about open systems (which perhaps should be reviewed in the article) is that
  • in any process the entropy of the {system + its surroundings} must increase.
Entropy of the system can decrease, but only if that amount of entropy or more is exported into the surroundings. For example, think of a lump of ice growing in freezing-cold water. The entropy of the water molecules freezing into the lump of ice falls; but this is more than offset by the energy released by the process (the latent heat), warming up the remaining liquid water. Such reactions, with a favourable ΔH, unfavourable ΔS_system, but overall favourable ΔG, are very common in chemistry.
I'm afraid Sewell's article is a bit bonkers. There's nothing in the second law against people doing useful work (or creating computers etc), "paid for" by a degradation of energy. Similarly there's nothing in the second law against cells doing useful work (building structures, or establishing ion concentration gradients etc), "paid for" by a degradation of energy.
Of course, Sewell has a creationist axe to grind. Boiled down, his argument is not really about the Second Law, by which all the above processes are possible. Instead, what I think he's really trying to drive towards is the notion that we can distinguish the idea of 'a process taking advantage of the possibilities of such energy degradation in an organised way to perpetuate its own process' as a key indicator of "life"; and then ask the question, if we understand "life" in such terms, is a transition from "no life" to "life" a discontinuous qualitative step-change, which cannot be explained by a gradualist process of evolution?
This is an interesting question to think about; but the appropriate place to analyse it is really under abiogenesis. It doesn't have a lot to do with the second law of thermodynamics. For myself, I think the creationists are wrong. I think we can imagine simple boundaries coming into being naturally, creating a separation between 'system' and 'surroundings' - for example, perhaps in semi-permeable clays, or across a simple lipid boundary; and then I think we can imagine there could be quite simple chemical positive feedback mechanisms that could make use of eg external temperature gradients to reinforce that boundary; and the whole process of gradually increasing complexity could pull itself up from its own bootstraps from there.
But that's not really an argument for here. The only relevant thing as regards the 2nd law is that the 2nd law does not forbid increasing order or structure or complexity in part of a system, so long as it is associated with an increase in overall entropy. -- Jheald 10:05, 13 July 2006 (UTC).
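(To put rough numbers on Jheald's freezing example -- approximate textbook values, with ΔH and ΔS treated as temperature-independent, so this is only a sketch:)

  # Freezing water: system dH and dS are both negative, yet the total
  # entropy (system + surroundings) rises below 0 C, so dG is negative there.
  dH = -6010.0   # J/mol released on freezing (approximate)
  dS = -22.0     # J/(mol K) lost by the freezing water (approximate)
  for T in (263.0, 273.15, 283.0):       # below, at, above the melting point
      dG = dH - T * dS                   # Gibbs free energy of freezing
      dS_total = dS + (-dH) / T          # latent heat warms the surroundings
      print("T=%7.2f K  dG=%+7.1f J/mol  dS_total=%+6.3f J/(mol K)"
            % (T, dG, dS_total))
  # dG < 0 and dS_total > 0 only below 273.15 K: freezing is spontaneous
  # there and not above, with no violation of the second law either way.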



  • Putting aside whatever ad hominem by association arguments you have against Sewell, please address the proposed statement summarizing Sewell's formulation of the entropy vs flux through the boundary, and the links provided. DLH 14:05, 13 July 2006 (UTC); DLH 14:23, 13 July 2006 (UTC)

Scientifically Based

Removed religious comments in the article. Please let us keep these articles scientifically based; I have never opened a textbook and come across those lines in any section on the second law. Physical Chemist 19:26, 18 July 2006 (UTC)


References

  1. ^ Sewell, Granville (2005). The Numerical Solution of Ordinary and Partial Differential Equations, 2nd Edition. ISBN 0471735809. Appendix D.
  2. ^ Sewell, Granville (2005) Can "ANYTHING" Happen in an Open System?
  3. ^ Sewell, Granville (2005) A Second Look at the Second Law

maximize energy degradation?

The article currently says: "[The system tends to evolve towards a structure ...] which very nearly maximise the rate of energy degradation (the rate of entropy production)". I've added a request for a source for this, and it would be interesting to expand this statement further (in this or other related articles; I could not find any description of this statement in Wikipedia). Pcarbonn 11:16, 25 July 2006 (UTC)

Prigogine and co-workers first proposed the Principle of Maximum Entropy Production in the 1940s and 50s, for near-equilibrium steady-state dissipative systems. This may or may not extend cleanly to more general further-from-equilibrium systems. Cosma Shalizi quotes P.W. Anderson (and others) being quite negative about it [2]. On the other hand, the Lorenz and Dewar papers cited at the end of Maximum entropy thermodynamics seem much more positive, both about how the principle can be rigorously defined and derived, and about how it does appear to explain some real quantitative observations. Christian Maes has also done some good work in this area, which (ISTR) says it does pretty well work for the right definition of "entropy production", but there may be some quite subtle bodies buried.
A proper response to you probably Needs An Expert; I'm afraid my knowledge is fairly narrow in this area, and not very deep. But I hope the paragraph above gives a start, all the same. Jheald 12:23, 25 July 2006 (UTC).
Note also, just to complicate things, that in effect there's a minimax going on, so Maximum Entropy Production with respect to one set of variables can also be Minimum Entropy Production (as discussed at the end of the Dewar 2005 paper, a bit mathematically). It all depends what variables you're imagining holding constant, and what ones can vary (and what Legendre transformations you've done). In fact it was always "Minimum" Entropy Production that Prigogine talked about; but re-casting that in MaxEnt terms is probably a more general way of thinking about what's going on. Jheald 13:09, 25 July 2006 (UTC).

Air conditioning?

I asked this question on the AC page but didn't get an answer: how does air conditioning NOT violate the 2nd law? Isn't it taking heat from a warm environment (inside of a building) and transferring it to a hotter environment (outside the building)? Doesn't this process in effect reverse the effects of entropy? Inforazer 13:19, 13 September 2006 (UTC)

The AC unit also plugs into the wall. Drawing electricity and converting it to heat increases entropy. Hope this helps. 192.75.48.150 14:44, 13 September 2006 (UTC)
That's right - The entropy of a CLOSED SYSTEM always increases. A closed system is one in which no material or energy goes in or out. You have electricity flowing into your room and besides, the air conditioner is dumping heat to the outside. The entropy loss you experience in the cool room is more than offset by the entropy transferred to the outside and the entropy gain at the electrical power plant. PAR 15:51, 13 September 2006 (UTC)
See heat pump for further more detailed analysis Jheald 17:15, 13 September 2006 (UTC).
An AC unit does not violate the 2nd law. In fact, it is a net emitter of heat; for every 1 unit of heat taken from inside a room, roughly 3 units of heat are added to the outside air (depending on the energy efficiency). Internally, the AC unit always transfers heat from warm to cold, by playing with the pressure of the fluid inside the AC unit. Roughly summarised: assume the refrigerant, say Freon, starts at the hot outside temperature. The Freon is compressed, thus raising its temperature. The Freon is then cooled to the outside hot temperature. The Freon is then decompressed, which cools it well below the outside air temperature; the system is also designed to cool the Freon below the inside room temperature. This cool Freon is then used to absorb as much heat as possible from the inside room. And from there the cycle continues. Taken individually, the AC unit may seem like a violation of the 2nd law, but as a whole, a lot more energy was wasted on making that one room cooler. Frisettes 06:01, 8 November 2006 (UTC)
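(The bookkeeping in these replies is easy to make concrete. All numbers below are illustrative assumptions -- including the coefficient of performance -- not data for any particular unit.)

  # Entropy balance for an AC unit pumping heat out of a cool room.
  T_in, T_out = 295.0, 308.0     # K: about 22 C inside, 35 C outside
  Q_c = 1000.0                   # J of heat removed from the room
  COP = 3.0                      # assumed coefficient of performance Q_c/W
  W = Q_c / COP                  # electrical work drawn from the wall
  Q_h = Q_c + W                  # first law: heat dumped outside

  dS_total = -Q_c / T_in + Q_h / T_out
  print("dS_total = %.2f J/K (>= 0: no violation)" % dS_total)
  print("Carnot COP limit = %.1f" % (T_in / (T_out - T_in)))

(This omits the entropy generated back at the power plant, which PAR mentions; including it only makes the total more positive.)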

A Tendency, Not a Law

I cut the following false part, it being a mathematical fact that when two systems of unequal temperature are adjoined, heat will flow between the two of them and, at thermodynamic equilibrium, ΔS will be positive (see: entropy). Hence, the 2nd law is a mathematical consequence of the definition S = Q/T.--Sadi Carnot 14:55, 17 September 2006 (UTC):

The second law states that there is a statistical tendency for entropy to increase. Expressions of the second law always include terms such as "on average" or "tends to". The second law is not an absolute rule. The entropy of a closed system has a certain amount of statistical variation, it is never a smoothly increasing quantity. Some of these statistical variations will actually cause a drop in entropy. The fact that the law is expressed probabilistically causes some people to question whether the law can be used as the basis for proofs or other conclusions. This concern is well placed for systems with a small number of particles but for everyday situations, it is not worth worrying about.
I restored the above paragraph, not because I think it deserves to be in the article, but because it certainly should not be deleted for being false. The above statements are not false. The second law is not absolutely, strictly true at all points in time. The explanation of the second law is given by statistical mechanics, and that means there will be statistical variations in all thermodynamic parameters. The pressure in a vessel at equilibrium is not constant: it is the result of particles randomly striking the walls of the vessel, and sometimes a few more will strike than the average, sometimes a few less. The same is true of every thermodynamic quantity. The variations are roughly of the order of 1/√N where N is the number of particles. For macroscopic situations, this is something like one part in ten billion - not worth worrying about - but for systems composed of, say, 100 particles it can be of the order of ten percent. PAR 15:15, 17 September 2006 (UTC)
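(The arithmetic behind that estimate, for concreteness:)

  import math
  # Relative fluctuations of thermodynamic quantities scale as 1/sqrt(N).
  for N in (100.0, 1e20):
      print("N = %.0e: relative fluctuation ~ %.0e" % (N, 1.0 / math.sqrt(N)))
  # ~1e-01 (ten percent) for 100 particles; ~1e-10 (one part in ten
  # billion) for a macroscopic N of about 1e20.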

Par, I assume you are referring to the following article: Second law of thermodynamics “broken”. In any case, the cut paragraph is not sourced and doesn’t sound anything like this article; the cut paragraph seems to contradict the entire Wiki article, essentially stating that the second law is not a law. If you feel it needs to stay in the article, I would think that it needs to be sourced at least twice and cleaned.

Furthermore, both the cut section and the article I linked to, and the points you are noting, are very arguable. The basis of the second law derives from the following set-up, as detailed by Carnot (1824), Clapeyron (1832), and Clausius (1854):

With this diagram, we can calculate the entropy change ΔS for the passage of the quantity of heat Q from the temperature T1, through the "working body" of fluid (see heat engine), which was typically a body of steam, to the temperature T2. Moreover, let us assume, for the sake of argument, that the working body contains only two molecules of water.

Next, if we make the assignment:

S = \frac{Q}{T}

Then, the entropy change or "equivalence-value" for this transformation is:

\Delta S = S_{\mathrm{final}} - S_{\mathrm{initial}}

which equals:

\Delta S = \frac{Q}{T_2} - \frac{Q}{T_1}

and by factoring out Q, we have the following form, as was derived by Clausius:

\Delta S = Q\left(\frac{1}{T_2} - \frac{1}{T_1}\right)

Thus, for example, if Q was 50 units, T1 was 100 degrees, and T2 was 1 degree, then the entropy change for this process would be 49.5. Hence, entropy increased for this process; it did not merely “tend” to increase. For this system configuration, consequently, it is an "absolute rule". This rule is based on the fact that all natural processes are irreversible by virtue of the fact that molecules of a system, for example two molecules in a tank, will not only do external work (such as to push a piston), but will also do internal work on each other, in proportion to the heat used to do work (see: Mechanical equivalent of heat) during the process. Entropy accounts for the fact that internal inter-molecular friction exists.
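(A quick check of that arithmetic, taking the quoted temperatures as absolute, as the example implicitly does:)

  Q, T1, T2 = 50.0, 100.0, 1.0
  dS = Q * (1.0 / T2 - 1.0 / T1)
  print(dS)   # 49.5: positive, so entropy increases for this heat flow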

Now, this law has been solid for 152 years and counting. Thus, if, in the article, you want to say “so-and-so scientist argues that the second law is invalid…” then do so, source it, and all will be fine. But to put some random, supposedly invalid and un-sourced argument in the article is not a good idea. It will give the novice reader the wrong opinion or possibly the wrong information. --Sadi Carnot 16:55, 17 September 2006 (UTC)

I think that the second law of thermodynamics is definitely supposed to be a law, considering that it is the convergence to a final value that is of interest here, not intrinsic statistical variations. There are three possible cases: convergence over time (for an isolated system etc.) to a state of (1) lower entropy, (2) constant entropy, (3) higher entropy. The second law of thermodynamics states that only (2) and (3) are possible. If it had stated that all of (1), (2) and (3) are possible, just that (2) and (3) are more probable than (1), it would probably not have been taken seriously by contemporary scientists. --Hyperkraft 18:43, 17 September 2006 (UTC)

First, let me say that I agree, the title of the section should not have been "A tendency, not a law". I don't think we should give the impression that the second law is seriously in question for macroscopic situations.

Regarding the article "Second Law of thermodynamics broken", I think that is just a flashy, misleading title for what is essentially a valid experiment. I had not seen that article, however, so thank you for the reference. Note at the end of the article, the statement that the results are in good agreement with the fluctuation theorem, which is just a way of saying that the degree of deviation from the second law is just what would be expected from statistical mechanics. The difference in this experiment is that entropy is being measured on a very fine time scale. This makes all the difference. Entropy variations are not being averaged out by the measuring process, and so the "violations" of the second law are more likely to be revealed, not because stat mech is wrong, but because the measurement process is so good. Check out the "Relation to thermodynamic variables" section in the Wikipedia article on the Partition function to see how statistical variations in some thermodynamic variables at equilibrium are calculated.

Regarding the example you gave, I am not sure how it relates to what we are talking about. The two bodies on either side of the working body are, I assume, macroscopic, containing very many particles, so the second law will of course hold very well for them, even if the working body only has two molecules. It's the working body which will have large variations in thermodynamic parameters, assuming that thermodynamic parameters can even be defined for such a small body. If you divide the working body into four quadrants, then having both particles in the same quadrant will be unlikely, and such a state will have a low entropy. Statistical mechanics says, yes, it will happen once every few minutes, and the entropy will drop, but on average it will be higher. When you say the second law is strict, you are effectively saying that both particles in the same quadrant will never happen, which is wrong. If the working body contained 10^20 particles, then statistical mechanics says they will all wind up in the same quadrant maybe once every 10^300 years and the entropy will drop, but on average it will be higher. When you say the second law is strict, you are effectively saying that it will never happen, and you are practically correct. PAR 15:24, 18 September 2006 (UTC)
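(Rough numbers behind the quadrant argument -- positional configurations only; the once-every-few-minutes and 10^300-year figures depend additionally on dynamical timescales, which this ignores:)

  import math
  # Chance that all N independently placed particles sit in one (any) of
  # four equal quadrants, and the corresponding entropy dip in units of k_B.
  for N in (2, 100, 10**20):
      log10_p = math.log10(4) + N * math.log10(0.25)
      print("N = %g: P ~ 10^%.3g, entropy dip = %.3g k_B"
            % (N, log10_p, N * math.log(4)))
  # N = 2 gives P = 1/4, as in the two-molecule example above.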

I gave the article a good cleaning, added five sources, and new historical material. Your points are valid. My point is that classical thermodynamics came before statistical thermodynamics and the second law is absolute in the classical sense, which makes no use of particle location. Talk later: --Sadi Carnot 15:45, 18 September 2006 (UTC)

Information entropy

I removed "information entropy" as an "improper application of the second law". I've never heard it said that all information entropy is maximized somehow. Just because physical entropy (as defined by statistical mechanics) is a particular application of information entropy to a physical system does not imply that the second law statements about that entropy should apply to all types of information entropy. PAR 06:38, 22 September 2006 (UTC)

Thanks for removing the sentence "One such independent law is the law of information entropy." As you say, there's no such law. However this is a common misunderstanding of evolution and while I appreciate the reluctance to mention the creation-evolution controversy, the point needs to be clarified and I've added a couple of sentences to deal with it. ..dave souza, talk 10:30, 27 September 2006 (UTC)

Intro

While the whole article's looking much clearer, the intro jumped from heat to energy without any explanation, so I've cited Lambert's summary, which introduces the more general statement without having to argue the equivalence of heat and energy. The preceding statement - that "Heat cannot of itself pass from a colder to a hotter body." is the most common enunciation - may well be true, but it may be an incorrect statement, as is being discussed at Talk:Entropy#Utterly ridiculous nitpick. Perhaps this should be reconsidered. ..dave souza, talk 09:36, 27 September 2006 (UTC)

The section header says it all. Let's not get carried away. We don't get this sort of thing with the first law because of general relativistic considerations, for example. 192.75.48.150 14:55, 27 September 2006 (UTC)
I have reverted this; Lambert's energy dispersal theories are at Votes for Deletion; see also: Talk:Entropy. --Sadi Carnot 16:08, 7 October 2006 (UTC)

Applications Beyond Thermodynamics + Information entropy

Jim62sch, I may guess where you are coming from, but I'm unclear whether the direction you are heading in is a good idea. What is the meaning of "beyond thermodynamics" when most notable physicists think that, whatever else happens, thermodynamics will hold true?

Also, defining it in terms of heat is a bit single-minded. It can equally well be defined in terms of information loss or neglect.

Obviously (to all but the contributors of The Creation Science Wiki, perhaps), nothing of this prohibits evolution or requires divine intervention.

Pjacobi 20:22, 7 October 2006 (UTC)

"Beyond thermodynamics is a place-holder" -- it can be called whatever works best for the article (I didn't like "improper" though). Also, I don't think the section is perfect, it can clearly be improved, but it does raise a very valid point see below.
Information loss in thermodynamics? Neglect in thermodynamics? Nah, that's going somewhere else with 2LOT that one needn't go. Problem is, entropy (so far as thermodynamics) really isn't as complex as many people want to make it.
I agree wholeheartedly with your last statement, and I recoil in disgust every time I see the alleged entropy/divine intervention nexus, but I'm not sure where you're going with the comment. Did I miss something? •Jim62sch• 20:46, 7 October 2006 (UTC)
I recently fell in love with the lucid explanation given by Jaynes in this article. Otherwise - I hope at least - I'm firmly based on the classic treatises given by Boltzmann, Sommerfeld and others (Becker, "Theorie der Wärme", was a popular textbook when I took thermodynamics). --Pjacobi 21:09, 7 October 2006 (UTC)
I removed a sentence from this section - the "prevention of increasing complexity or order in life without divine intervention" absolutely does not stem from the idea that the second law applies to information entropy. PAR 00:01, 8 October 2006 (UTC)
Also - absolutely, that article by Jaynes is one of the most interesting thermodynamics articles I have ever read. PAR 23:02, 9 October 2006 (UTC)

I removed the following:

The typical claim is that any closed system initially in a heterogeneous state tends to a more homogenous state. Although such extensions of the second law may be true, they are not, strictly speaking, proven by the second law of thermodynamics, and require independent proofs.

Last time I checked, the laws of science are true until proven otherwise, and they do not "claim", they "state". This sentence is unsourced; I replaced it with two textbook sources. --Sadi Carnot 11:15, 9 October 2006 (UTC)

General remark

I've just compared an old version [3] and have to say, that I wouldn't judge article evolution to be uniformly positive since then. What happened? --Pjacobi 13:13, 9 October 2006 (UTC)

Crap editing is my guess. I would support your reversion to that version if you feel so inclined. KillerChihuahua?!? 13:16, 9 October 2006 (UTC)
The last ten months of edits have been crap? I'm taking a wiki-break, I had enough of you and your derogatory comments. --Sadi Carnot 13:35, 9 October 2006 (UTC)
Probably not all; however, if the article has degraded, then yes, I'd say crap editing is the culprit. It happens to many articles. Reverting to a previous version and then incorporating the few edits which were improvements is one method of dealing with overall article degradation. Why on earth is my agreeing with Pjacobi and offering a suggested approach to dealing with article degradation a "derogatory comment"? It is a positive approach to cleaning out a cumulative morass of largely poor edits. KillerChihuahua?!? 13:54, 9 October 2006 (UTC)

The Onion

Sadi recently added[4] The Onion as a source. While I agree the statement might need sourcing, The Onion is a parody site and as such does not meet WP:RS, and cannot be used here. I request an alternative ref be located which meets RS if one is deemed necessary for the statement. Alternatively, a brief statement in the article such as "This has been parodied on the humor site, The Onion" with the ref there would keep the pointer to the Onion article without using it as a source for a serious statement. KillerChihuahua?!? 14:19, 9 October 2006 (UTC)

Yeah, I saw that one too. This quote is embarrassing... either this is a stunt, or the person who initially wrote this has no sense of humour at all and a very poor view of god-fearing folks. I am deleting the reference now. Frisettes 06:03, 8 November 2006 (UTC)

Entropy

Until recently this statement was included:
A common misunderstanding of evolution arises from the misconception that the second law applies to information entropy and so prevents increasing complexity or order in life without divine intervention. Information entropy does not support this claim which also fails to appreciate that the second law as a physical or scientific law is a scientific generalization based on empirical observations of physical behaviour, and not a "law" in the meaning of a set of rules which forbid, permit or mandate specified actions.
PAR deleted it as stated above. I've not reinstated it, as in many cases it may be a misunderstanding about "disorder", but as shown here the common argument is the opposite of the article statement of some "reasoning that thermodynamics does not apply to the process of life", which I've only seen in the Onion parody. The Complex systems section later addresses the point indirectly, but the Applications to living systems section at present could give comfort to this creationist position, as its reference to "complexity of life, or orderliness" links to a book without clarification of the point being made. ..dave souza, talk 15:41, 9 October 2006 (UTC)
Good grief, I remember that - it took what, about six months of discussion with Wade et al to get that wording worked out? From what I see PAR's argument for removing it is that he hasn't ever heard of this misunderstanding - is that correct? KillerChihuahua?!? 18:49, 9 October 2006 (UTC)
No, his argument is [5] and I'm inclined to agree. --Pjacobi 19:04, 9 October 2006 (UTC)
2LOT is used by creationists though, and it is via a misunderstanding. The big brouhaha was about 1) whether to include this in the 2LOT article at all, and 2) how to phrase it if included. Wade was the big protagonist for inclusion as I recall - and I dropped out of the arguments here for a while and wasn't part of any phrasing discussion. Oddly enough, on Talk:Second law of thermodynamics/creationism#Straw poll PAR was for inclusion (I was against), so I'm wondering why he didn't just edit it to make better sense rather than removing it. KillerChihuahua?!? 19:22, 9 October 2006 (UTC)
I'm in favor of a reasoned discussion of the creationists point of view, but the statement I removed was simply false. I don't know how to make it make better sense. PAR 22:59, 9 October 2006 (UTC)
Well I was not for including it in the first place, but the last straw poll showed opinion was fairly evenly split so I'm not going to argue this. Unfortunately, I cannot seem to figure out a way to write the position clearly. Can anyone write a simple statement which gives the Creationist use, and makes clear it is erroneous? KillerChihuahua?!? 23:05, 9 October 2006 (UTC)
Suggestion:
A common misunderstanding of evolution arises from the misconception that the second law requires increasing "disorder" and so prevents the evolution of complex life forms without divine intervention. This arises from a misunderstanding of "disorder" and also fails to appreciate that the second law as a physical or scientific law is a scientific generalization based on empirical observations of physical behaviour, and not a "law" in the meaning of a set of rules which forbid, permit or mandate specified actions.
For an example of the reasoning that this deals with, see User:Sangil's comment of 22:04, 14 May 2006, at Talk:Evolution/Archive 016#Kinds. Any suggestions for clarification welcome. ..dave souza, talk 15:06, 11 October 2006 (UTC)

Mysterious edit comment:

Kenosis wrote "Removing "laws of thermodynamics" template until admins can fix the inadvertent insertion of Jane Duncan into the text"

Do you mean this: [6]

That is obviously plain minor vandalism and should be reverted by any user, no admin intervention needed.

Pjacobi 08:54, 11 October 2006 (UTC)

Thank you for repairing it. I didn't know until now that we could access the templates without admin privileges. Appreciate it. ... Kenosis 13:11, 11 October 2006 (UTC)

Mathematical origin?

I've cut the following addition by Enormousdude:

Second law of thermodynamics mathematically follows from the definitions of heat and entropy in statistical mechanics of large ensemble of identical particles.

Apart from the quibble that ensembles are made of large numbers of identical systems, not large numbers of identical particles, it seems to me the statement above is not true. The 2nd law does not follow mathematically from the definitions of heat and entropy in statistical mechanics. Mathematically it requires additional assumptions establishing how information is being thrown away from unusually low-entropy initial conditions.

Even if the statement were correct, it adds little value to the article in the absence of any indication as to how the second law might "follow directly from the definitions of heat and entropy". Jheald 21:10, 2 November 2006 (UTC)

Axiom of Nature

Since when has nature had axioms? This seems very poorly worded. LeBofSportif 13:57, 14 November 2006 (UTC)

Please give a better statement of the second law

Please give a better statement. The statement of the second law as it is given at this time (Nov 27, 2006) on the English Wikipedia is inaccurate. Unfortunately, this form seems to have become more and more fashionable. This has to change!

Proper statements could be:

"The entropy of a closed adiabatic system cannot decrease".

"For a closed adiabatic system, it is impossible to imagine a process that could result in a decrease of its entropy"

This is documented in a number of textbooks, such as Reif, Fundamentals of Statistical and Thermal Physics; Ira N. Levine, Physical Chemistry; and others.

Consider a system that exchanges heat with a single constant-temperature heat source. For a cyclic process of such a system, we have w >= 0 and q <= 0. (Kelvin; the equalities correspond to a reversible process, the inequalities to an irreversible process.)

Now we consider an irreversible change of a closed adiabatic system from state I to state F. Let w_IF be the work done (algebraic) on the system during this process. Let us now imagine a reversible process that brings the system back from state F to state I, during which we allow the system, if needed, to exchange heat with a heat reservoir at temperature T_therm.

The first law allows us to write:

w_IF + w_FI + q_FI = 0. If the system were adiabatic during this reversible process, then q_FI would be zero and the total work done would be zero as well. This would imply that the cycle is reversible (Kelvin), and this cannot be the case since the process I→F is irreversible. Hence q_FI cannot be zero; it is therefore negative. The reversible F→I process can only be a reversible adiabatic process to bring the temperature of the system to the temperature of the heat reservoir, followed by an isothermal process at the temperature of the heat reservoir, and finally another reversible adiabatic process to bring the system back to state I. The entropy change of the system during the reversible process is S_I - S_F = q_FI/T_therm < 0.

This demonstrates that S_F - S_I > 0 even if the closed adiabatic system exchanges energy with its surroundings in the form of work.

This shows the validity of the second law as stated just above. It is much more powerful than the "isolated system" form, since the system can exchange work with its environment.

A simple example would be the compression (p outside larger than p inside) or the expansion (p inside larger than p outside) of a gas contained in an adiabatic cylinder closed by an adiabatic piston. With the "isolated system" formulation, you cannot predict the evolution of such a simple system! Ppithermo 23:50, 27 November 2006 (UTC)
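(A concrete instance of Ppithermo's piston example, with made-up numbers: one mole of monatomic ideal gas, irreversibly compressed by a constant external pressure, no heat exchanged. The setup is my own illustration, not from any source in this thread.)

  import math
  R = 8.314                   # J/(mol K)
  Cv = 1.5 * R                # monatomic ideal gas
  T1, P1 = 300.0, 1.0e5       # initial equilibrium state (K, Pa)
  P_ext = 2.0e5               # constant external pressure = final pressure

  # First law with q = 0: Cv*(T2 - T1) = -P_ext*(V2 - V1), where V = R*T/P
  T2 = T1 * (Cv + R * P_ext / P1) / (Cv + R)            # 420 K
  dS = Cv * math.log(T2 / T1) + R * math.log((T2 / P_ext) / (T1 / P1))
  print("S_F - S_I = %.2f J/K" % dS)                    # about +1.23 J/K
  # Positive although the system is closed and adiabatic and work was done
  # on it -- the content of the "closed adiabatic" statement.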


An isolated system cannot transfer energy as work.
PAR 01:35, 28 November 2006 (UTC)
I think that Ppithermo's point is that a more useful statement of the second law would involve the relationship between work and heat as types of energy transfer, and an isolated system involves neither. An isolated system represents only one special case (the work-free case) of a closed, thermally insulated system. In the more general sense, imagine a system that is the contents of a sealed, insulated box with a crank attached to get shaft work involved. No matter what is inside that box (generators, refrigerators, engines, fuels, gases, or anything else), I can turn the crank all I want to perform work on the system or I can let the system do all it can to turn the crank and perform work for me, and the second law says that neither case can possibly decrease the total entropy of the system inside that box. Entropy can decrease locally, but system entropy can't when the system is closed and thermally insulated, whether it's isolated or not. All my input of shaft work won't decrease entropy because it's easy for work to make things hotter but it's impossible for hot things to do work unless they exchange heat with cold things. If I want to decrease entropy in there, heat or mass must be transferred to the outside. Work won't do it.
If the statement of the second law uses an isolated system, that information is lost. I wouldn't favor a statement that uses the phrase "it is impossible to imagine..." but I do like the idea of replacing "isolated system" with "closed system with an adiabatic boundary". Still, there always has to be a balance between making general, powerful statements and overly-specific, easily understood statements. Flying Jazz 04:33, 28 November 2006 (UTC)
Well, I put that statement in because that's the statement I've always heard, and I knew it was true. However, I do think you have pointed out a lack of generality. I can't come up with a counter-argument (right now :)). Why not change it to the "adiabatic closed" form, and meanwhile I will (or we could) research it some more?
I looked at Callen (Thermodynamics and thermostatistics) which is what I use as the bottom line unless otherwise noted, and he does not deal with "the laws". He has a number of postulates concerning entropy, and the one that is closest to the second law states:

"There exists a function (called the entropy S) of the extensive parameters of any cmmposite system, defined for all equilibrium states and having the following property: The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states."

I've come to realize that my understanding of the role of constraints in thermodynamic processes is not what it should be. I don't fully get the above statement, so I will not put it into the article, but perhaps you can use it to modify the second law statement. PAR 14:42, 28 November 2006 (UTC)
Well, this is the problem. So many books, especially the more recent ones, have it wrong. I have had many hard times from so many people who do not agree with the closed adiabatic statement because they were taught the isolated form and never questioned it. They often think it's a simple detail. If PAR wrote the statement, then I guess it should be modified. I find that this is not a difficult concept; closed adiabatic is as simple to comprehend as isolated, I believe. I think the small demonstration of the fact that S_F - S_I > 0 should be included below the statement, since it illustrates it well. Ppithermo 22:02, 28 November 2006 (UTC)
I think it should be included but not in the introduction, better in the overview. The introduction is for people who want to get a quick idea of the article. PAR 01:12, 29 November 2006 (UTC)

Rigor about isolated vs closed adiabatic

I've found a source that makes me think Ppithermo may be too emphatic about some of this. First off, "isolated" is not wrong. It is true. It just seemed to be more specific than it needed to be. Second, the equation S_F - S_I > 0 is not needed when dS/dt >= 0 is in the current version. I think PAR's post of Callen's statement is too dense for a Wikipedia article. When I googled and found http://jchemed.chem.wisc.edu/Journal/issues/2002/Jul/abs884.html yesterday, I thought of simply changing the words "isolated system" to "closed system with adiabatic walls" in the introduction, but there was a problem with that. I don't think a closed adiabatic system involved in a work interaction can reach equilibrium until the work interaction stops, so referring to a maximum entropy at equilibrium would be inappropriate unless the system is isolated. And the introduction should refer to equilibrium in my view. Another huge advantage of using the "isolated system" second law statement is that it's commonly used. But that still doesn't mean it's a good thing for Wikipedia.

So I was confused and looking for some rigor about this until I saw page 5 here: http://www.ma.utexas.edu/mp_arc/c/98/98-459.ps.gz. It says the following (I've simplified the state function algebra some and I'm not bothering with MathML here):

"Second law: For any system there exists a state function, the entropy S, which is extensive, and satisfies the following two conditions:

a) if the system is adiabatically closed, dS/dt >= 0
b) if the system is isolated, lim (t->infinity) S = max(S)

...Part (a) of the 2nd law, together with the 1st law, gives the time evolution. On the other hand, part (b), which defines the arrow of time, expresses the fact that the isolated system evolves towards a stable equilibrium in the distant future....Furthermore, part (b) of the second law, together with the 1st law, yields all the basic axioms of thermostatics, as well as the zeroth law of thermodynamics."

The article that contains this quote is about a problem of thermostatics, so that might be why part (b) is emphasized, but it looks to me like we don't have a general/specific issue here, but we have two sides of the same (or a similar) coin, and if one side is presented more often in other sources, I think Wikipedia should follow suit in the intro. Presenting both (a) and (b) in the Mathematical Description section will make things clear. For now, I've changed "isolated" to "closed" in the Mathematical Description section and I hope someone else can add the math. I think the current statement of the law in the introduction is fine. Flying Jazz 04:44, 29 November 2006 (UTC)
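(A toy illustration of why part (a) alone says nothing about equilibrium: keep doing paddle-wheel work on a closed adiabatic water bath. Constant heat capacity and complete degradation of work to heat are simplifying assumptions of mine.)

  import math
  C = 4184.0                  # J/K: heat capacity of about 1 kg of water
  T, S_prod = 300.0, 0.0      # start at 300 K; track entropy produced
  W = 1000.0                  # J of stirring work per step
  for _ in range(1000):
      T_new = T + W / C                   # adiabatic: work becomes heat
      S_prod += C * math.log(T_new / T)   # dS = C ln(T2/T1) each step
      T = T_new
  print("T = %.0f K, entropy produced = %.0f J/K" % (T, S_prod))
  # dS/dt > 0 for as long as the crank turns (part a), but S approaches
  # no maximum; equilibrium language needs the isolation of part (b).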

The statement with "isolated" is correct and a common statement, so it's ok to keep it there. What bothers me is that I am under the impression that the laws of thermodynamics, properly stated, give all the "physics" needed for thermodynamics. If you have a rigorous description of the mathematics of the space of thermodynamic parameters, and a tabulation of properties, then that, in combination with "the laws", should enable you to solve any thermodynamics problem. I haven't gone through the argument above enough to be sure that the "isolated" statement is insufficient in this respect. If it is not sufficient, and there is an alternative, accepted statement of the second law which IS sufficient, that would be an improvement.
I also agree that simply replacing "isolated" with "closed adiabatic" would be wrong, thanks for pointing that out. The system would never equilibrate if work was being done on it. PAR 07:21, 29 November 2006 (UTC)


Sorry, I am not sure how to make this a new paragraph.
I agree with everyone here that the isolated statement, as it is, is true; I never said it was not. My point is that it does not constitute a proper second law.
I actually am not too happy with the presence of dS/dt. Remember, thermodynamics is supposed to be about equilibrium states. If there is an evolution of a system (in general irreversible), the thermodynamic variables may be undefined or ill-defined.
The following sentence in the above comment is certainly not justified:

I also agree that simply replacing "isolated" with "closed adiabatic" would be wrong, thanks for pointing that out. The system would never equilibrate if work was being done on it.

The argument that "the system would never equilibrate" is rather bizarre. The system evolves from a state of equilibrium to another state of equilibrium when a constraint is modified (for example, external pressure). It may take a very long time for this evolution to take place, but we only consider the states that correspond to equilibrium. For example, if the external pressure is different from the internal pressure (mobile boundary), the (closed adiabatic) system evolves and tends to bring the internal pressure to the same value as the external pressure. When this is achieved, work interaction will cease, and the system can reach equilibrium.
How could you possibly deal with a closed adiabatic system with work exchange using the "isolated" formulation... Please explain if you can. My opinion is that you cannot.
The other point is that the isolated statement is not equivalent to the Kelvin formulation (you cannot have w < 0 and q > 0 for a system in contact with a thermal reservoir during a cyclic process; everyone seems to agree with the Kelvin formulation). The Kelvin formulation cannot be derived from the "isolated" formulation. How would you bring about the work term?
If you feel that the derivation I outlined to show that S_F - S_I > 0 is erroneous, please indicate so, and possibly why. (It is so straightforward...)
I have various sources which have similar or equivalent derivations. (Please look at the Reif book, which is a classic for physicists; he writes that the entropy of a closed, thermally insulated system...)
The paper cited by PAR asserts the fact with no proof, and unfortunately, I could not find a derivation...
Ppithermo 11:25, 29 November 2006 (UTC)
I hope you don't mind that I formatted your post above. When you respond, just indent your response from the previous, using colons, until about four or five colons, then start back at the margin. Each new paragraph has to be indented.
Regarding your response above:
  • I don't understand your objection to the sentence "The system would never equilibrate if work was being done on it." Then you give an example which says that a closed, adiabatic system can come to equilibrium as long as no more work is being done on or by it, which is just what the sentence you object to says. A closed, adiabatic system which has no work done on or by it is effectively an isolated system.
  • Please understand that I am not rejecting the point you are making, I just haven't had time to understand it completely. In fact, I think you may have a point. I cannot dispute your explanation right now, but I will look at it more carefully and get back to you.
  • The reference that I cited is not a paper, it's a book by Herbert B. Callen. It is the most widely cited thermodynamics reference in the world, and it's the best thermodynamics book I've ever seen. (I will check out the Reif book and see how they compare.) The statement I quoted was not a theorem, it was a postulate, much like the second law. There is no "proof" of the second law other than experience.
PAR 14:39, 29 November 2006 (UTC)
Thanks for the formatting. I was hoping one of you would help me with that.
I guess confusion sets in because I am trying to reply to two persons and I do not say who I am responding to. I will now respond to PAR and the different points made in the last response.
The statement I objected to seemed to imply that if work is done on a system, the system cannot reach equilibrium, which made no sense to me. Clearly, when the system has reached its final state, it is in a state of equilibrium. No work is done on it anymore; in fact, no change takes place anymore. But work may have been done between the initial and final states. (The good old example of the cylinder and piston, both adiabatic, with different pressures inside and outside in the initial state, prior to releasing the piston.)
I know Callen's book. I have the second edition.
When the second law is stated for an "isolated system", you cannot derive from it the Kelvin form of the second law. When stated for a "closed adiabatic system", you can.
In the demonstration I presented, the system spontaneously goes from an initial state of equilibrium I to a new equilibrium state F after a constraint is modified. If the entropy S had not attained its maximum, the spontaneous process would carry on. The process thus stops because the entropy of the system has reached its maximum, given the constraints.
Take time to reflect on the stuff. With this statement of the second law, it is a breeze to derive the Clausius inequality for a system that can come into contact with multiple thermal reservoirs.
Also note that I favor writing equations for finite processes rather than differential forms since, as I mentioned earlier, thermodynamic variables are well defined for equilibrium states.
Reif's book is also one of the best-known statistical thermodynamics books. I do not have my copy here, but I remember that he gives a summary of the four laws, and there he clearly states that the maximum entropy principle holds for a thermally insulated system. Ppithermo 22:31, 29 November 2006 (UTC)
It's my view that complete rigor in both thermodynamics and thermostatics (a "proper second law") cannot be attained without both part (a) and part (b) as stated above. Either work interactions are permitted (closed adiabatic) or they're not (isolated). Of course, Ppithermo is correct when he says that closed adiabatic systems are ABLE to reach equilibrium after the work interaction stops, but in a discussion about closed adiabatic vs isolated in a second-law statement, this is a little bit sneaky. Open systems with everything changing are able to reach equilibrium too--after the mass transfer and thermal contact and work interactions stop. Whether to express equations in differential form or not is a matter of taste and I think it depends on the specific equation. dS/dt >= 0 looks more elegant in my opinion than S_f - S_i >= 0. In the Carnot cycle article, using differentials instead of S_a and S_b would be silly.
I hope you both focus on parts a and b of the second law statement above. For a closed, adiabatic system, entropy will increase (part a), but, as long as the work interaction continues, this says nothing about whether a maximum will be achieved. If a work interaction could continue forever then entropy would increase forever. You need part b to discuss equilibrium.
For an isolated system, entropy will reach a maximum with time after equilibrium is achieved (part b). But this says nothing about the futility of work interactions in creating a perpetual motion machine of the second kind. It says nothing about work interactions at all. You need part a to discuss that. Flying Jazz 23:55, 29 November 2006 (UTC)
Answer to Flying Jazz.
Actually, I do not like to lean on the paper you refer to, which asserts the result and does not contain a demonstration. You can consider it an axiomatic approach, but the equivalence with other forms is far from demonstrated.
dS/dt >= 0 may look more elegant, but it is not appropriate. Suppose you place in contact two objects that are at different temperatures: there will be an irreversible heat exchange, and during the process the temperatures of the objects will be inhomogeneous and the thermodynamic variables will not be well defined. They become well defined when equilibrium is reached. This is why I object to the dt.
Now let us get back to the work problem.
1) Work done on or by a system is a frequent occurrence.
2) Consider a cyclic process of a machine during which the machine receives work w (w may be positive or negative at this stage).
3) Suppose that for part of the cycle we allow the machine to come into contact with a single thermal reservoir at temperature T, and that during this contact the machine receives an amount of heat q. The thermal reservoir then receives an amount of heat
q_{therm} = -q\,
All the thermodynamic variables of the machine are back to their original values at the end of the cycle. The change in internal energy of the machine is zero. Hence
w+q=0 \quad\mathrm{and}\quad (S_F - S_I)_{sys}=0\,
The entropy change of the thermal reservoir is
(S_F - S_I)_{therm} = -q/T\,
The composite system machine + thermal reservoir constitutes a closed adiabatic system. We have
(S_F - S_I)_{composite} = -q/T\,
Now using the closed adiabatic formulation of the second law, we must have
(S_F - S_I)_{composite} \ge 0\,
and therefore -q/T >= 0, thus q <= 0 and w >= 0:
we get back the Kelvin statement of the second law, which says we cannot have q>0 and w<0.
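For readability, here is the chain above collected in one place (same conventions, q and w received by the machine; this is just a restatement of the steps already given):
w+q=0,\quad (S_F - S_I)_{sys}=0,\quad (S_F - S_I)_{therm}=-q/T\,
(S_F - S_I)_{composite}=-q/T \ge 0 \;\Rightarrow\; q \le 0,\; w \ge 0\,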
With the isolated-system formulation, you cannot get anywhere, because no provision is made to deal with work!
Hope this is convincing enough. Please read it carefully. Hope PAR reads this too and appreciates it.
Ppithermo 14:22, 30 November 2006 (UTC)
I think you are being a little sneaky again, Ppithermo. Two objects at different temperatures in thermal contact with each other are not a closed adiabatic system, so part (a) above would not apply. If you are against the differential statement, please consider a closed, adiabatic system with a work interaction. I agree with everything you write after that, including your last paragraph, but this does not get to the heart of the issue of what should be in the introduction or the bulk of the article.
In engineering, the total entropy of systems that are out of equilibrium is found very often. Spatial inhomogeneities are accounted for by making certain assumptions, and then you integrate over some spatial dimension, and, presto! You have values of the thermodynamic parameters for the contents of a cylinder of a steam engine at one non-equilibrium moment or another. The idea that it is a challenge to rigorously compute non-equilibrium thermodynamic parameters, and that therefore differentials should not be used, is not compelling. Differentials are too useful, and the mathematical version of the plain-English statement "entropy increases" is both more concise and more general than "entropy's final value is greater than its initial value." Every result that uses the second statement can be derived from the first, so your cyclic example, while correct, is not an argument against using differentials. Also, a statement of the second law that only uses equilibrium values would permit a perpetual motion machine of the second kind if that machine never achieved equilibrium. Perpetual motion machines are impossible whether they achieve equilibrium or not, and a good second-law statement would reflect this.
I came across the abstract for this recent article during a Google search: http://www.iop.org/EJ/abstract/1742-5468/2006/09/P09015 which shows that this topic is a subject of current theoretical study. These stat mech guys assumed something-or-other and concluded: "We also compute the rate of change of S, ∂S/∂t, showing that this is non-negative [see part a above] and having a global minimum at equilibrium. [see part b above]." I don't have access to the complete article and I probably wouldn't understand it anyway (I'm not a stat mech guy), but when I see the same things repeated often enough just as I learned them, and they make sense to me, I don't mind seeing them in Wikipedia even if a rigorous demonstration with no assumptions doesn't exist at the moment. With my engineering background, these kinds of conversations sometimes make me smile. Why can't these physicists get around to rigorously proving what we engineers have already used to change the world? :) My old chemE thermo text used something called "the postulatory approach", which goes something like this: We postulate this is true. It seems to work and we've used it to make lots of stuff and lots of money. Hey, let's keep using it. Flying Jazz 02:04, 1 December 2006 (UTC)
Answer to Flying Jazz
first paragraph
I am not trying to be sneaky. I was just giving an example where temperatures are inhomogeneous. I can have a closed adiabatic system: two gases at different temperatures in a closed adiabatic container, separated by an adiabatic wall. Remove the wall. Clearly the temperatures will not be homogeneous...
I agree, of course, that with certain assumptions one can calculate entropies and state functions for systems where they are not exactly homogeneous. The trouble with that is that most people will not remember what the assumptions are!
As a presentation, things should remain simple yet attempt to be accurate. The point I was trying to make in my last message is that from the closed adiabatic statement you can get the Kelvin formulation, and reciprocally (in a previous part) from the Kelvin formulation you can get the closed adiabatic statement.
Please note that the fact that people repeat things they heard somewhere else does not make those things right.
I do not understand your statement about the perpetual motion machine of the second kind. I showed precisely that with a single thermal reservoir you cannot get w<0, which is equivalent to saying that no useful work can be produced...
Consider a machine that undergoes cyclic changes and can exchange heat with two heat sources. You can have an engine that runs forever and does not break the second law at all. The global system of machine plus thermal reservoirs constitutes a closed adiabatic system. The entropy of the machine is unchanged after each cycle. The entropy of the hot reservoir decreases and that of the cold reservoir increases. For one cycle, the entropy change of the global system satisfies
-q_h/T_h - q_c/T_c > 0\,
The global entropy change is > 0 and of course w<0: work is provided to the user. All of this is in perfect agreement with the closed adiabatic formulation. As long as your hot reservoir can supply heat and the cold one can receive heat, your machine can do work for you.
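(As an aside, not part of the argument above but a standard consequence of the same inequality: combining it with the first law per cycle, w + q_h + q_c = 0, bounds the useful work:
-q_h/T_h - q_c/T_c \ge 0 \;\Rightarrow\; q_c \le -q_h T_c/T_h\,
-w = q_h + q_c \le q_h\left(1 - T_c/T_h\right)\,
which is the Carnot limit on the work delivered per cycle.)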
No one can show this with the isolated system formulation. Hence the beginning of this discussion. If you know a book where this is shown, other than a 100-page paper with hundreds of assumptions, please tell me.


Finally, since you have some engineering background, you might be willing to look at some physical chemistry books. If you can put your hands on Physical Chemistry by Robert Alberty, seventh edition, look up page 82. Another book is Physical Chemistry by Ira N. Levine; I do not have it handy, so I cannot give you the exact page number.
Ppithermo 10:57, 1 December 2006 (UTC)
If you look at the paragraph on available useful work, the starting point is the second law for a closed adiabatic system. If I can find some time, I will try to fix that first statement so that it fits as nicely as possible with the rest of the article.
Ppithermo 14:28, 2 December 2006 (UTC)

Well, I finally had time to go through the above discussion, and I still don't understand everything. When I think of the closed adiabatic system, I think of a system that could be incredibly complex, all different sorts of volumes, pressures, temperatures, mechanisms, whatever. When Ppithermo says "the closed adiabatic system is brought into contact with a constant temperature heat bath" I lose the train of thought. I can't generalize what happens to an arbitrarily complex system when it is brought into contact with a heat bath.

In trying to come up with an argument, using the isolated form of the second law, which shows that a closed adiabatic system cannot have a decrease in entropy, the best I can do so far is this: suppose we have a closed adiabatic system called system 1. It is mechanically connected to system 2, so that only mechanical (work) energy can be exchanged between the two. We can always define system 2 so that system 1 and system 2 together form an isolated system. It follows that with this definition, system 2 is also closed and adiabatic. So now we have two closed adiabatic systems connected mechanically to each other which together form an isolated system. The sum of the changes in their entropies must be greater than or equal to zero by the isolated form of the second law. But if I want to concentrate on the entropy change of system 1, that means I am free to choose system 2 in any way possible. If I am able to find a system 2 which has no change in entropy when a particular amount of mechanical energy is added/subtracted, then it follows that the entropy of system 1 cannot decrease for this case. But system 1 is independent of system 2, except for the mechanical connection, so if I can find such a system 2, then it shows that the entropy change of system 1 cannot be negative for ANY choice of system 2.

The bottom line is that, if you assume that there exists a closed adiabatic system which can have work done on/by it with zero change in entropy, then you can prove, using the isolated form of the second law, that no closed adiabatic system can have a decrease in entropy.

This is all well and good, but what about the statement of the second law in the article? I don't think the statement as it stands is adequate. My above argument would indicate that the isolated statement of the second law needs the proviso of a zero-entropy machine just to prove the closed adiabatic statement of the second law. I don't even know if it can prove Kelvin's statement. Another favorite book of mine is "Thermodynamics" by Enrico Fermi. In this book, Kelvin's statement is given as the second law (no work from a constant temperature source) and so is Clausius' statement (heat can't flow from a cold to a hot body). Fermi then goes on to prove that the two statements are equivalent, and then that the isolated form of the second law follows from these two equivalent statements.

I am starting to appreciate more the quantitative value of the Kelvin and Clausius statements, and I am starting to favor them as statements of the second law. I am not clear whether the closed adiabatic statement of the second law is equivalent to these two statements. Or the "amended isolated" form either, for that matter. PAR 17:17, 4 December 2006 (UTC)

In reading Callen, I find that his statement of the second law (mentioned above) is actually about the same as the isolated statement except more precise, and he then uses "reversible work sources" (what I was calling a "zero entropy machine") to prove the "maximum work theorem". The fact that a system which is only mechanically connected to a work source (reversible or not) cannot have a decrease in entropy is then a special case of the maximum work theorem. Check it out. So the bottom line is that I am ok with the present statement. PAR 01:19, 10 December 2006 (UTC)


Sorry, I have been very busy and unable to come back here.

Your system 1 / system 2 argument does not add up for me. A zero-entropy-change process is a reversible adiabatic process. Note that here you are already making assumptions about a system that exchanges work with its environment. Note also that the process in system 1 can be irreversible. How could you possibly imagine some coupling where work is done reversibly on one side and irreversibly on the other...?

Let's rather concentrate on what could be understandable to people at large. An axiomatic statement like that of Callen and many others does not appeal to anyone and can only be understood by a handful of people.

You say that the Kelvin statement is a good one and can be related to easily. I sort of agree with that. How can we make use of that to state the second law? The scheme is the following:

  1. Kelvin statement - A system undergoing a cyclic process that can exchange heat with a single constant-temperature heat source cannot provide useful work: w >= 0 and q <= 0 (and of course w+q=0, the first law).
  2. Using this statement, one can show that it is equivalent to a more powerful form: the entropy of a closed adiabatic system cannot decrease.

For a reversible adiabatic process the entropy change is zero (for example, a reversible adiabatic process of an ideal gas), and thus any other (irreversible) adiabatic process implies an increase in the entropy of the system.
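For the ideal-gas example just mentioned, the zero entropy change can be checked directly (per mole, with \gamma = C_p/C_v, C_p - C_v = R, and constant heat capacities assumed):
\Delta S = C_v\ln(T_2/T_1) + R\ln(V_2/V_1)\,
and along a reversible adiabat TV^{\gamma-1} is constant, so \ln(T_2/T_1) = -(\gamma-1)\ln(V_2/V_1), giving
\Delta S = \left[R - (\gamma-1)C_v\right]\ln(V_2/V_1) = 0\,
since (\gamma-1)C_v = C_p - C_v = R.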

Many conclusions can then be obtained very simply.

For a system that can come into contact with one thermal reservoir, the (algebraic) amount of heat that the system can receive during a reversible process is always at least as great as the amount it can receive during an irreversible process between the same states:

q_{sys,rev} \ge q_{sys,irrev}\,

One can show the Clausius inequality very easily for a system undergoing a cyclic process during which it can exchange heat with several thermal reservoirs:

\sum_i q_i/T_i \le 0\,
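The derivation runs just like the single-reservoir argument above: the machine plus all the reservoirs form a closed adiabatic composite, the machine's entropy is unchanged over a cycle, and reservoir i changes by -q_i/T_i, so the closed adiabatic statement gives
(S_F - S_I)_{composite} = \sum_i \left(-q_i/T_i\right) \ge 0\,
which is the inequality above with the sign rearranged.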

Another application: consider an irreversible engine that operates by coming into contact with two thermal reservoirs. After any number of cycles, the engine's entropy is unchanged. The entropy of the hot thermal reservoir has decreased; the entropy of the cold thermal reservoir has increased. The total entropy of the closed adiabatic system made of the engine and the reservoirs has increased, as it should according to the second law for a closed adiabatic system. So clean and simple.

NONE OF THESE CONCLUSIONS CAN BE OBTAINED WITH THE ISOLATED STATEMENT (for systems where work is exchanged).

I hope you guys can see the light at the end of the tunnel, because this is probably all I can do to enlighten you. If you still like the statement as it is, then enjoy it. I can tell you that a few months back I read this second law article and there was a statement saying that "the closed adiabatic form would be the appropriate form". Clearly this has been removed, and I do not know how to get to old versions to document that. If you cannot understand my arguments, then others will not either, and there is no need for me to change that statement, since it would be destroyed immediately by individuals who "learned it that way...". Ppithermo 14:57, 15 December 2006 (UTC)


I think you have not understood my argument. Here is a diagram. System 1 is adiabatic and closed. It may or may not be reversible. Its change in entropy ΔS1 may be anything. Now we mechanically connect a reversible work source (system 2) to it. System 2 is reversible, so its change in entropy ΔS2 is zero. The combination of both systems constitutes an isolated system. By the isolated version of the second law,
\Delta S_1+\Delta S_2\ge 0\,
Since system 2 is reversible:
\Delta S_2 = 0\,
Therefore
\Delta S_1 \ge 0\,
Which proves, using the isolated form of the second law, that a closed adiabatic system (system 1) cannot have its entropy decreased. All of the proofs you showed above then follow. PAR 16:35, 15 December 2006 (UTC)

I understood your argument. There is no such thing as a reversible work source that is isolated... In essence, reversible work is only possible if the net force acting on the moving part is zero (external force = internal force). Hence a reversible work source cannot be isolated...

I was away for more than two weeks and very busy afterward; sorry for not answering sooner. Since you like the Callen book, try to get your hands on the first edition and look at Appendix C. You will find there that Callen says, rightfully I believe, that the second law does not allow finding the equilibrium state of two coupled thermally insulated systems.

128.178.16.184 13:31, 26 January 2007 (UTC)

Hi - is this Ppithermo? I agree, a reversible work source is not isolated, but there is no isolated reversible work source in the above diagram. It is an adiabatic closed reversible work source. I am using the following definitions

  • adiabatic means TdS=0
  • closed means μdN = 0
  • isolated means closed, adiabatic, and PdV=0

I have Callen second edition with no Appendix C (why would Callen leave that out?). But I have difficulty believing the statement. Is there an explanation included? PAR 15:28, 26 January 2007 (UTC)

Sorry, I signed but forgot to actually log in. I don't know why Callen left that out of edition 2; maybe he felt a little uncomfortable about it. In the second edition, see page 53, problem 2.7.3 - except that there he tells you to show the property yourself. I directed you to Appendix C of edition 1 because the explanation is from the MASTER.
My objection to your drawing is that for ΔS to be zero in your system 2, the work done on that system has to be reversible.
Lucky you if, given an irreversible process with work involved, you can make that work correspond to reversible work done on the second system...
I think you should stop hanging onto the bad statement.
Look at the article anyway. A few lines below, whoever wrote that part of the text makes use of the fact that the entropy change of a closed adiabatic system can only be positive. Ppithermo 14:41, 2 February 2007 (UTC)
I looked at problem 2.7.3, and it is for an isolated volume divided by a movable, adiabatic wall into two sub-volumes, with unequal pressures. But Callen only considers the case when both sides are reversible (the wall oscillates) or both are irreversible (viscous forces heat both sides and the wall comes to rest with constant pressure on both sides). He's right - in the irreversible case, you have to know how much energy was given to each side by the viscous processes before you can determine the temperatures; you can't figure it out from the thermodynamic laws alone. However, if we assume that one side undergoes a reversible transformation while the other side does not, then we know that all the viscous dissipation occurred in the irreversible side, and then the situation becomes solvable.
Notice that the "mechanical connection" in the diagram does not have to be a movable adiabatic wall; it can be a crankshaft or any other mechanical device for transferring work energy. It could be a rope connected to a piston in system 1 and to a hanging weight in system 2. Any motion of the rope would simply raise or lower the weight; it would not alter the temperature of the rope or weight, and so it would be reversible. All the potential energy you store in the weight by lifting it can be recovered. This would be a reversible work source for system 2, and it would not matter whether it was connected to a reversible or irreversible system 1.
Also, take a look at page 103 in Callen (2nd ed). It has a diagram very similar to the one above, but it is dealing with a more general case. Notice the system labelled "Reversible work source". PAR 23:47, 2 February 2007 (UTC)


I looked at page 103 and I do not like his treatment at all. I still think that it is unreasonable to consider that work done irreversibly on a system can correspond to reversible work done on another one.
So finally, for lack of time and stamina, I give up.
Enjoy your preferred statement of the second law. I'll enjoy the one I use.
Changing the Wikipedia page is pointless.
Wikipedia will probably disappear shortly.
I am writing yet another thermodynamics book... and I'll try to make it better than the previous ones.

Ppithermo 17:13, 20 February 2007 (UTC)

[edit] Second law of thermodynamics in fiction?

I have a feeling that the concept of the second law of thermodynamics isn't rare in science fiction works, e.g. The Last Question. We should add a new section about this. Frigo 12:14, 8 December 2006 (UTC)

[edit] "Probable"

I have deleted the "probable" paragraph, since I believe it is inaccurate. This is a slightly tricky point, but because entropy is a function of what we can know about a system (macrostate) rather than a function of phase space (microstate, i.e. the position and momentum of each particle), the second law of thermodynamics always holds. A system might enter an "unusual" microstate purely by chance, but that doesn't mean its entropy has reduced. A system's entropy is only reduced if you know its range of possible microstates has become restricted. —Ashley Y 01:32, 18 February 2007 (UTC)

This seems valid, but it is also saying that entropy applies statistically at the macrostate level and not in terms of microstates. There seems to me to be a need for such clarification in the introduction: could you perhaps write a suitable statement? As a non-expert I'd be most grateful. dave souza, talk 11:19, 18 February 2007 (UTC)
The paragraph should be restored. It's true, the entropy is only defined for a macrostate, and it is proportional to the total number of microstates that could possibly give rise to that macrostate. However, a system does not sit in this "equilibrium" macrostate; it bounces around between a number of macrostates that are nearly the same (for large numbers of particles), and can even, very rarely, jump to a macrostate which is more significantly different from the equilibrium macrostate. I mean, if you measure the pressure of a gas, it has statistical variations; it's not an absolute constant. The same is true of the entropy: it has statistical fluctuations, and sometimes will fall, sometimes will increase. The second law is a statistical law and it is not always strictly true. The statement above which says "A system might enter an "unusual" microstate purely by chance, but that doesn't mean its entropy has reduced" is true, as long as the unusual microstate corresponds to the "equilibrium" macrostate. The point that is missed is that a system might also enter an "unusual" microstate which does NOT correspond to the "equilibrium" state, and thus the entropy may decrease. PAR 19:25, 18 February 2007 (UTC)
This is not correct. A macrostate is what one knows about a system, so a system cannot enter another one by microstate variation. For instance, even if all the molecules of a gas move to the left half of a container, purely by chance, it's still in the same macrostate.
Likewise "equilibrium" is a macrostate, i.e. a probability distribution of microstates. All bouncing around of microstates is represented by this distribution, so it cannot enter a new macrostate by itself. —Ashley Y 03:59, 19 February 2007 (UTC)
Please look at the macrostate article. If all the molecules of a gas move to the left half of a container, this is a classical example of a DIFFERENT macrostate. A macrostate is, by definition, a state that is described by the macroscopic thermodynamic variables - pressure, temperature, density, etc. When a gas fills a container, at equilibrium, it can be described by a single constant density, neglecting statistical variations. A gas confined to the left half of a container cannot be described by a single value of density, and therefore, by definition is a different macrostate. PAR 22:33, 19 February 2007 (UTC)
Ah, but all the molecules moving spontaneously to the left is no more than a statistical variation. Density is itself a macrostate function, and that macrostate happens to include a range of microstates, including the one with all the molecules moved to the left.
As the macrostate/microstate article correctly says, "a macrostate is characterized by a probability distribution on a certain ensemble of microstates". A macrostate of equilibrium particles in a box will include microstates that have all the particles in the left half of the box, just as it includes all the other microstates. —Ashley Y 00:06, 20 February 2007 (UTC)

Ok, let's be very clear about this. Are you saying that the following statements are true?

  • A gas that happens to be in the left half of a container (due to a statistical variation) is in the same macrostate as a gas that fills the entire volume.
  • The entropy of a gas that happens to be in the left half of a container (due to a statistical variation) is equal to the entropy of a gas that fills the entire volume.

PAR 11:12, 21 February 2007 (UTC)

The first statement is true. A macrostate is a probability distribution of microstates, and that includes a very small probability for the set of microstates in which the molecules all happen to be in the left half of the container.
The second is true because the macrostate is the same, and entropy is a function of macrostate.
Consider just three molecules in a box. The equilibrium macrostate will include a 1/8 chance that all the molecules are in the left half of the box. But it's still in the same macrostate. If you confine the three molecules to the left by force (for instance by pushing a plunger), then the chance becomes 1 and it's a new macrostate. —Ashley Y 02:48, 22 February 2007 (UTC)
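As a toy illustration of that 1/8 figure (a brute-force sketch, assuming each molecule sits independently in either half with probability 1/2, which is the idealization used above), the count generalizes to (1/2)^N:

from itertools import product

def prob_all_left(n):
    # Probability that all n molecules are in the left half, assuming each
    # molecule is independently on the left or right with probability 1/2.
    microstates = list(product("LR", repeat=n))  # all 2**n equally likely microstates
    hits = sum(1 for m in microstates if all(s == "L" for s in m))
    return hits / len(microstates)

for n in (3, 6, 10):
    print(n, prob_all_left(n), 0.5 ** n)  # brute force agrees with (1/2)**n

For n = 3 this prints the 1/8 above; for anything like a mole of gas the chance is (1/2)^N with N ~ 10^23, which is why such fluctuations are never observed.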
Suppose I have a container with the gas confined by a partition to the left half of the container. Suppose then that the partition is very quickly removed (i.e. quickly as compared to the time that it takes for the gas to expand into the right half of the container). Are you saying that the entropy just as quickly jumps to the same value of entropy that the gas will have after it spreads throughout the container? If not, then what is the time development of the entropy for this scenario? PAR 05:01, 22 February 2007 (UTC)
It does not jump step-wise. The time development for the entropy depends on the expected time it takes for the gas to diffuse. This is fast in a vacuum (because the particles are typically moving fast), but not infinitely so. The entropy will asymptotically approach the new value. —Ashley Y 05:27, 22 February 2007 (UTC)
But I asked in my second statement above whether "The entropy of a gas that happens to be in the left half of a container (due to a statistical variation) is equal to the entropy of a gas that fills the entire volume", and you answered that it was "true because the macrostate is the same, and entropy is a function of macrostate". These two statements conflict, unless the gas somehow "knows" that it arrived unconstrained in the left half by statistical variation rather than partition removal, and we know it does not, right? I am saying that the second statement, not the first, is in fact always correct. PAR 15:39, 22 February 2007 (UTC)
In the case of the gas immediately after the partition has been removed, it's all on the left because it hasn't diffused yet. That's not the same as statistical variation for a gas that's in equilibrium throughout the whole chamber but happens to have all molecules on the left. They're in different macrostates. The equilibrium macrostate includes a very small probability that all molecules will be on the left, while the gas immediately after the partition has been removed has a much larger probability that all the molecules will be on the left. Different probability distributions means different macrostates, and therefore different entropies. —Ashley Y 21:37, 22 February 2007 (UTC)

So if I present you with a container in which the gas is all on the left side of the container, (but without a partition), you are saying that you cannot tell me what its entropy is until you know whether it is the result of a statistical variation or a very recently-removed partition? PAR 21:59, 22 February 2007 (UTC)

If you present me with a container in which you know that the gas is all on the left side of the container, then you're giving me a probability distribution and a macrostate. But that's not statistical variation. A gas that's in equilibrium in the whole container has a very small chance that all the molecules are on the left. That's a different probability distribution and so a different macrostate. All you can tell me is the macrostate, i.e. what the probability distribution of microstates is. The best you can do is present me with a container of gas at equilibrium and say that there's a very small chance that all the molecules are on the left. —Ashley Y 22:48, 22 February 2007 (UTC)
No, I can present you with a container in which all the gas is on the left side of the container, and therefore out of equilibrium. It's a clear, measurable, concrete situation. Here is a container, and we both can see that all the gas is on the left side of that container. We both know it because that's what it really is. That is all I am presenting you with. Are you saying that you cannot tell me what its entropy is until you know whether it is the result of a statistical variation or a very recently-removed partition? If not, what is the entropy? PAR 00:45, 23 February 2007 (UTC)
You can present me with a container of gas at equilibrium, and that will include a small chance that it's all on the left through statistical variation. But even if it is, it will still be in the "equilibrium" macrostate, with the corresponding entropy. If you want to determine whether all the gas has moved to the left through random variation, you will have to add some kind of detector. The detector itself will be at a particular temperature: if it is colder than the gas, then the whole system is not yet at equilibrium, and energy will move from the gas to the detector. If it is at the same temperature T, then it will be subject to its own thermal variation, which will give false energy reading fluctuations of at least the order of kT. Since this is also the order of energy of the molecules it is trying to detect, it cannot be used to detect single molecules.
Of course, if you know that all the molecules are on the left because you just removed the partition, that's a different macrostate. —Ashley Y 01:58, 23 February 2007 (UTC)
It's a simple question - if I present to you a container with all the gas on the left side of the container, how do I calculate the entropy? PAR 03:13, 23 February 2007 (UTC)
OK, you have presented me with a container. How do you know all the gas is on the left side? —Ashley Y 08:27, 23 February 2007 (UTC)

It's a glass container; the gas is bromine, which is a reddish-orange gas. We can see that it is all on the left hand side. PAR 16:25, 23 February 2007 (UTC)

OK, so now we have a new system, consisting of a glass container of gas, a light source powered by a battery, and a detector (your eye). Energy is being pumped from the battery to the light source and we are using that to obtain information about the gas. The information gained puts the gas into a new macrostate with reduced entropy, but in the process energy from the battery has been converted to heat, which is a gain in entropy of at least as much. —Ashley Y 23:21, 23 February 2007 (UTC)
Ok, but I ask again - given this container, and this knowledge that the gas is in the left half of the container, how do I calculate the entropy? What is the entropy? I don't want detail after detail, just an outline of how to calculate an actual number which represents the entropy of this configuration. Let's assume it is an ideal gas for simplicity. PAR 00:16, 24 February 2007 (UTC)
How do you know all the gas is on the left side? Bear in mind entropy is all about information and probability. —Ashley Y 03:48, 24 February 2007 (UTC)
As I said before, we see it on the left side. Battery, light bulb, human eye, if you prefer. Now, for the fourth time I ask, what is the entropy of the system in which the gas is all on the left side of the container? PAR 15:17, 24 February 2007 (UTC)
But you're not talking about just a container full of gas. You're talking about a new system consisting of a battery, light, eye, and a container full of gas thus observed. I can tell you about the entropy of this new system, if you want. —Ashley Y 20:58, 24 February 2007 (UTC)

Yes, and please give some example numbers or at least some symbols, so I can better conceptualize what you are saying. For example, the entropy of an isolated ideal gas in a volume V with N particles, and a fixed total energy and therefore fixed temperature T at equilibrium is just S where

S=Nk\ln(V/N)+S_0(T)\,

where k is Boltzmann's constant, and S_0(T) is some function of T, and therefore constant. I am saying that if all the gas is in one half of the container, the entropy is

S'=Nk\ln(V/2N)+S_0(T)\,

Which is just the same formula, but using V/2 rather than V. S' is smaller than S. As the gas expands, the entropy increases from S' to S. I am saying that this is a classical situation, the measurement process does not appreciably affect the gas. PAR 23:15, 24 February 2007 (UTC)
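To put rough numbers on that drop S - S' = Nk ln 2 (a sketch; the fluctuation estimate P ~ exp(-ΔS/k) = 2^{-N} is an added standard rule of thumb, not something claimed in the posts above):

import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_drop_half(N):
    # S - S' = N k ln 2 for a gas found in half its volume
    return N * k * math.log(2)

for N in (10, 1000, 6.022e23):
    dS = entropy_drop_half(N)
    log10_P = -N * math.log10(2)  # exp(-dS/k) = 2**(-N), expressed as a power of 10
    print(f"N={N:g}  dS={dS:.3e} J/K  P ~ 10^{log10_P:.3g}")

For N = 10 the spontaneous half-volume fluctuation has odds of about one in a thousand; for a macroscopic N it is of order 10^(-10^23).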

The measurement process itself adds entropy to the system. The entropy of our system is
S_{system} = S_{gas} + \Delta S_{measurement}\,
S_gas is the same as your S'.
S_{gas}=Nk\ln(V/2N)+S_0(T)=Nk\ln(V/N)-Nk\ln2+S_0(T)=S-Nk\ln2\,
As you pointed out, that's a drop in entropy (actually corresponding to N bits of information). But ΔS_measurement is the term I've been trying to point out to you. It's the amount of extra entropy you end up with from the measurement process needed to verify that all the molecules are on the left. Remember, entropy is all about information. In order to verify that all N molecules are on the left, you will need more than N photons. And to distinguish them from black-body radiation from the gas itself, they must have more than kT energy each. So just to make the measurements, we must take more than NkT of energy from the battery and turn it into heat. Since everything is at a temperature T, that represents an increase in entropy of more than Nk.
\Delta S_{measurement} > Nk\,
S_{system} > S-Nk\ln2 + Nk > S\,
So as you can see, the second law still holds. —Ashley Y 08:19, 25 February 2007 (UTC)
Just for the record, Bennett has shown that the measurement process itself need not require an increase in entropy. But what must cost entropy is the clearing of a register so there is somewhere to put the result. Jheald 22:44, 25 February 2007 (UTC)
Ok, now I think I see what you are saying and this is very interesting to me. Tell me if this is an accurate paraphrasing of what you are saying.
A macrostate encodes the information we have about a system in terms of macroscopic variables (at a particular time). Let's say it's an isolated system in equilibrium in macrostate Φ. Then that macrostate does not change in time, for two reasons: it is an equilibrium state, and we do not update our information about it in any way. Nevertheless, the macrostate we might find by measuring that system may not correspond to macrostate Φ due to statistical variations. Let's say that macrostate Φ' is in fact measured. Also, the measurement of the system destroys its isolation. We may measure the entropy of the system S(Φ') to be less than the equilibrium value in the original macrostate S(Φ), but the process of measurement results in an increase of entropy outside the system that is more than enough to assure that the second law is strictly true when applied to the original system and the measurement system combined. PAR 17:37, 25 February 2007 (UTC)
Regarding Jheald's comment on Bennett, right, I don't think the idea that the photons must be appreciably above the average black body energy is the important part, the important part is that the entropy increases. Maybe I misunderstood, but I thought that the recording process constituted a sort of increase in entropy since the "disorder" in the recording registers increased. Erasure converted this to "thermodynamic entropy". These are vague notions in my mind, so forgive me if I present them poorly. I'm just not sure about Ashley's idea that the entropy increase due to measurement is always greater than any conceivable entropy drop of the gas system due to statistical variations. To Ashley Y - do you have any references for this argument so I could read up on it? PAR 06:04, 26 February 2007 (UTC)
I have restored the "fluctuation" sentence to the introduction along with a reference to Landau & Lifshitz "Statistical Physics Part 1". The quote below from L&L (page 29) is part of a discussion of the second law for "closed" systems, which, by L&L's definition, is what we are calling an isolated system. The quote is as follows:

...we must remember that in reality the probability of transition to states of higher entropy is so enormous in comparison with that of any appreciable decrease in entropy that in practice the latter can never be observed in Nature. Ignoring decreases in entropy due to negligible fluctuations, we can therefore formulate the law of increase of entropy as follows...

I do not by any means consider this discussion closed. All I have done is managed to find a reference from a class-A source which supports the statement in question. From Jheald's mention of Bennett, I have found where he says that there is no lower energy limit to a measurement, but Bennett refers to it being shown in detail "elsewhere" and I have not been able to find that "elsewhere" yet. If anyone has any pointers to that, please let me know. This statement negates part of Ashley Y's argument. As a sort of logical positivist however, trying to answer Ashley Y's (wrong) statements has caused me some head scratching, and I'm not fully happy with the results.

I believe L&L are wrong about a decrease in entropy being possible. However, I do not have a reference to hand on this. It would be helpful if they attempted to come up with a very simple system in which they thought the probability of a decrease in entropy were significant.

One might be able to do measurements at lower energy in general, but in this particular case there is no way to make ΔS_measurement less than the decrease in entropy of the gas.

A macrostate is simply a probability distribution over microstates. It's all the information you have about what the microstate is. If the system is in equilibrium, then the macrostate is constant. If the system is not in equilibrium, then the macrostate will change over time even if you don't do any measurement. For instance, if the molecules were confined to one side of the box, and you remove the partition, you expect the molecules to move over to fill the box at some particular rate. If you do some measurements of individual molecules you might find out more information about the momentary state of the gas, which would give you a different macrostate, but that macrostate would move back towards equilibrium as the information were lost. —Ashley Y 19:09, 27 February 2007 (UTC)

I think I understand what you are saying. Do you have any references or documents at all for this point of view, because I would like to understand it better. Is there a more detailed proof that the entropy of measurement is more than enough to offset any statistical variation in entropy that is seen when the system (which is in equilibrium) is measured? PAR 00:43, 28 February 2007 (UTC)
Another reason why the photon explanation does not work: the gas in the "equilibrium" state could, by statistical variation, be confined to the left third of the container, in which case the entropy would be S - Nk ln 3 and the total entropy would satisfy
S_{system} > S-Nk\ln3 + Nk\,
and since Nk\ln3 > Nk, this lower bound lies below S, so there could be a net decrease in entropy. PAR 01:37, 28 February 2007 (UTC)
I think if you consider the number of photons necessary to verify the absence of particles in the rest of the box, you'll see it's many more than N. After all, the box might be filled with gas, but each photon passing through may just happen to miss any particular gas particle. So you would have to flood it with enough photons so that the probability of this happening is significantly less than the probability that all the gas is in one section of the box. —Ashley Y 01:49, 28 February 2007 (UTC)
Yes, I agree. There's a certain probability that a photon will make it through the gas, which depends on the density. If you have a particular distribution of photons making it through, you have to be sure that variations are due to a statistical variation of the gas density, not of the photons, and that means you need many more photons than particles. But I think you could use photons that were much LOWER in frequency than the black body peak rather than much higher, as long as the wavelength was much larger than the container dimensions. That would fit in with Jheald's mention of Bennett's idea that there is no lower limit on energy needed to make a measurement. My problem is the entropy. If the energy of the photons can be very low, where does the entropy come from? PAR 06:21, 28 February 2007 (UTC)
A low-energy photon won't be absorbed. To detect opaque gas molecules, I think you have to get the energy in the correct range. —Ashley Y 09:22, 28 February 2007 (UTC)
Well, yes, but in the same sense a high-energy photon won't be absorbed. Actually a high or low energy photon will be absorbed, but with low probability, so you need more of them to get decent statistics. If you use photons near the black body peak, you need more of them to get decent statistics not because of the probabilities, but because of the interference of photons generated by the gas. But this still does not address the entropy problem. PAR 21:18, 28 February 2007 (UTC)
You can't use low-energy photons because they won't be absorbed. So you will still create entropy if you use photons to make measurements. —Ashley Y 21:34, 28 February 2007 (UTC)

It's not that cut and dried. If the gas is at temperature T, then it has a black body emission spectrum corresponding to that temperature. The maximum absorption wavelength is the wavelength of the maximum emission. As you go lower and lower in wavelength, you get less and less absorption (and emission). As you go higher and higher in wavelength, you also get less and less absorption (and emission). That's Kirchhoff's law. As you go to lower (or higher) wavelengths, you have to use more photons to get the same amount of information, however, because the lower probabilities cause more noise. You might have a point if you are saying that the net energy needed may not decrease, because the net energy is the product of the energy per photon times the number of photons. That is a straightforward calculation and I don't know what the answer would be. If I have time I will do it. This discussion bears on the Maxwell's demon problem as well. PAR 06:14, 1 March 2007 (UTC)

[edit] Entropy fluctuations

Starting a new subheading, because the discussion above shows no signs of completion yet. Anyway, here's my take:

It's important to distinguish between the entropy of the whole universe, and the entropy of the system. In classical thermodynamics, we are used to writing

S_{\mathrm{tot}} = S_{\mathrm{sys}} + S_{\mathrm{R}}\,

where S_sys is a function of the macroscopic variables of the system,

S_{\mathrm{sys}} = S_{\mathrm{sys}}(T_{\mathrm{sys}}, V_{\mathrm{sys}}, P_{\mathrm{sys}} ...)\,

and S_R is the entropy of the surroundings, or reservoir.

This all translates quite happily to statistical thermodynamics, if we use the Gibbs formula for the entropy, rather than the Boltzmann formula; i.e. if we use

S = -k \sum_i p_i \ln p_i

to allow that the state probabilities p_i (for states of the system, or states of the reservoir) need not all be the same.

Now at equilibrium, we expect S_tot to be maximised, and constant. But that doesn't mean that S_sys and S_R need be constant. Their total will be constant; but individually, both entropies will have probability distributions. And this uncertainty is in fact realised over time: the entropy of the system does indeed fluctuate, as its physical macroscopic variables fluctuate. We can even predict just how much all those variables should fluctuate, using statistical mechanics.
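A minimal numerical sketch of that claim (a toy model, not from the discussion above: N independent particles, the macrostate taken to be the number in the left half, and Boltzmann entropy S(n) = k ln C(N, n)):

import math, random

N = 100   # particles; macrostate = number in the left half
k = 1.0   # Boltzmann's constant in toy units

def S(n_left):
    # Boltzmann entropy of the macrostate: k ln(multiplicity)
    return k * math.log(math.comb(N, n_left))

S_max = S(N // 2)  # the equilibrium macrostate has the largest multiplicity
random.seed(1)
for _ in range(5):
    n_left = sum(random.random() < 0.5 for _ in range(N))  # one random snapshot
    print(f"n_left={n_left:3d}  S={S(n_left):6.3f}  S_max={S_max:6.3f}")

The printed S hovers just below S_max, and sizeable dips become astronomically rare as N grows, which is the fluctuation behaviour described above.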

So, if we take for our 'system' a small micro-volume of the bromine gas you were discussing; with the 'surroundings' as the remaining volume, then we expect the partial pressures and the entropy in the micro-volume to fluctuate. In fact, close to phase transition critical points, the fluctuations may become very noticeable indeed, and easily susceptible to measurement. (With, to reassure Ashley, the information gain represented by the measurement being vanishingly small compared to the size of the entropy fluctuations).


Summarising the take-home message: when we talk about entropy fluctuations, we're talking about fluctuations in the entropy of a system. These are exactly balanced by corresponding opposite fluctuations in the entropy of the surroundings.

So, if in an ideal gas we get slight localised hot-spots of temperature, pressure and entropy, these are exactly balanced by the slight reduction in the average temperature, pressure and local entropy in the rest of the gas; so that the total entropy remains maximised. Jheald 09:23, 25 February 2007 (UTC).

I think what Ashley Y is saying above goes beyond this. What I am saying is that if you have an isolated system, i.e. a container of gas in equilibrium, there is a non-zero possibility that the gas will momentarily wind up all on the left side of the container at some point in time, as a result of statistical variations. This is true for a completely isolated system. If I have a container with three particles, there is a non-zero possibility that all three will be on the left hand side of the container at some point in time. Just extend that to N particles. The entropy of a container with all the gas on the left hand side is lower than when it is more or less evenly distributed. The entropy of an isolated system can spontaneously drop.
What Ashley Y is saying has a quantum-mechanical flavor to it. If you have an isolated system composed of a gas at equilibrium, the macrostate of the gas is what it was last measured to be, which is, of course, an equilibrium state. The question of statistical variations has no objective meaning without a measurement process. The question of statistical variations is a question of the probability that a particular, perhaps non-equilibrium, macrostate (e.g. all gas on the left side) will be measured when the measurement process is applied to the above mentioned equilibrium state. The question of what macrostate it is "really" in when unmeasured is improper. Upon measurement, we may find that all the gas is on the left hand side of the container (due to statistical variations) and say that the gas is now in a severely non-equilibrium macrostate. The combined (lowered) entropy of the measured state and the measurement apparatus after the measurement will be greater than (or equal to?) the combined entropy of the equilibrium macrostate and the measurement apparatus prior to measurement.

[edit] Question

If the universe started as a point of compressed energy, then wouldn't that point of energy count as completely entropic, because the distribution of energy is completely uniform in a point? Or does it not make sense to talk about the entropy of a point?

[edit] Second law of thermodynamics vs. creationists

I would like to see the following quote (or, the information in it) included in this article. It simply and elegantly explains why evolution does not violate the 2nd law. --Thorwald 02:57, 13 April 2007 (UTC)

Another favorite of the theists is the second law of thermodynamics, or entropy. Savvy creationists have given up this as an argument against evolution, but it is still pulled out to argue for the existence of a creator. According to the second law, the total entropy, or disorder, of a closed system must increase over time. If the universe started as chaos, the theist argues, a miracle was needed to impose order upon it. On the other hand, if the universe was maximally ordered at the beginning of time, this could be interpreted as the signature of a perfect creator. But the cosmological evidence indicates that the universe began in a state of maximum entropy — and that the total entropy of the universe has been increasing ever since! This apparently contradictory state of affairs is explained by the fact that the universe is expanding, with the maximum possible entropy of the universe growing faster than the total actual entropy. Thus, the universe only appears to be getting more ordered, but this is only because there is more room to spread out the clutter. In short, no miracle, and hence no creator, is needed to explain the origin or current state of the universe.

I would want it substantially rewritten before appearing in an encyclopaedic article; this is written as debate, not as information. LeBofSportif 08:26, 13 April 2007 (UTC)

[edit] Clausius

"There are many statements of the second law which use different terms, but are all equivalent. (Fermi, 1936) Another statement by Clausius is: Heat cannot of itself pass from a colder to a hotter body." I think this statement needs to be qualified, as heat most certainly can and does pass from colder to hotter bodies. To take a macro example - if you place two stars of greatly different surface temperatures in the same neighbourhood, they will both be radiating energy and each will be subjected to incoming radiation from the other body. The radiation emitted by each body is proportional to the 4th power of its absolute temperature and is not dependent on the temperature of the other body. What we are usually interested in is the net gain or loss of energy from a body, but we shouldn't lose sight of the two-way traffic of radiant energy. Paul venter 05:24, 26 May 2007 (UTC)

Yes, I had run across this issue during my searches on the Web and am thoroughly puzzled by it. The transfer of radiative energy as a contradiction of the "heat never flows from a cold to a hot body" sounds faintly suspicious -- along the lines of those who claim the EPR paradox "proves" that faster-than-light communications are possible (when it doesn't). However, on the face of it I can't see any cause to disagree.
I am aware of the fact that an infrared imager can't pick up a body that's cooler than it is because the imager's own thermal emission drowns out any thermal emission from the target. This is why sensitive infrared imagers have to be cooled. I suspect this factoid may contain a wedge (hmm, maybe that's a bad term to use in a topic infested by Darwin-bashers) to drive out this seeming violation of the hot-to-cold directive. Does someone with a professional background in physics have any insight into this puzzle? MrG 4.225.210.207 01:24, 18 July 2007 (UTC)
The Clausius statement as I've encountered it was in fact: It is impossible for a cyclic device to operate in such a way that the sole result is heat transfer from a cooler to a warmer body, and the statement attributed to Lord Kelvin is also rendered as the "Kelvin-Planck" statement: It is impossible for a cyclic device to operate in such a way that work is produced by extracting heat from a single thermal reservoir. Counting here and in the actual article, that gives us five different statements -- does anyone know what the actual words of the scientists involved were? siafu 16:21, 18 July 2007 (UTC)
Read Clausius' Fourth Memoir to understand the "by itself" part of the second law; this has to do with the old caloric theory of Antoine Lavoisier, in which caloric particles (heat) were indestructible and could pass through material unchanged and without affecting the energy of the system. Lavoisier's theory, however, could not account for heat generated by friction, and this was what Clausius was addressing. --Sadi Carnot 09:18, 25 July 2007 (UTC)

[edit] Fix the intro

It's apparent from reading the introduction that there was a quotation from Rudolf Clausius that has since been removed from the article. Could one of the regular editors of this article either restore the quotation, or rewrite the intro so that it's not confusing? Shalom (HelloPeace) 09:38, 9 October 2007 (UTC)

[edit] Work without a temperature difference

This edit struck me as I read it on my watchlist. The edit itself is fine in the context of the paragraph concerned (heat engines), but I'm not sure that the underlying concept (that you can't do useful work without a difference in temperature) is particularly useful. What about chemical batteries? Ah, you say, but they're not at equilibrium and they're not cyclical! But then neither is any other real system... Physchim62 (talk) 15:22, 17 November 2007 (UTC)

A few points, and then I'll start to wander off before coming back to the article. The current version of the article, properly, never implies "you can't do useful work without a difference in temperature," so I think you missed the underlying concept. Work without a temperature difference (an isothermal process) involves thermodynamic free energy, and the second law can safely be ignored...for a constant chemical composition. That's kind of why it makes sense to talk about heat engines so much in the article (in addition to historical reasons): because at some point every heat engine needs to change its temperature, the second law always applies, even to the most ideal case of Carnot.

You're wrong that real systems aren't cyclical. Car engines are cyclical. Maybe you meant that real cyclical systems aren't reversible? I guess that's true for macroscopic systems. For example, we could take a battery, use the electricity to spin a flywheel, and then use that macroscopic kinetic energy to charge the battery back up again, and it would be a reversible cycle with absolutely no losses, so the second law wouldn't apply...if we ignored electrical resistance and friction, which would increase temperature and would therefore require a cold sink for the heat to be removed...so in reality we would need to model a little heat engine that would have to be mixed into that system too, and the second law would apply there also.

When chemical composition isn't constant, there can be cyclical "chemical engines" instead of "heat engines" that generate work from changes in chemical potential instead of from changes in temperature. These chemical engines can operate isothermally, but they're still irreversible because they have losses due to random mass transfer in the microscopic-to-macroscopic transition that are analogous to the entropic losses considered in heat engines due to random heat transfer in the microscopic-to-macroscopic transition. Mathematically, the TS term in equation 2 in Exergy#Mathematical_description disappears and the N terms act like the S term. This equation also appears later in the second law article. I'm pretty sure that a particle physicist would describe this mass-transfer loss as an entropic loss, but it's entropy due to particle identity, interactions, and location instead of particle speed and location, so it's separated out into a different term in the equation. To an engineer and to non-particle physicists it looks like a wacky sort of second law that isn't due to entropy the way we usually consider entropy, because of the primary importance of heat engines in utilizing second-law concepts. Chemical potentials are useful to ChemEs because they separate out the kind of entropy that Carnot and mechanical and thermal engineers think about (S in Gibbs's equations). All those other nasty chemical-specific kinds of entropy that some particle physicist or molecular chemist might bug us about are rudely lumped together into the potentials. Chemical potentials and fugacities are like microscopic-to-macroscopic magic for ChemEs, and I can't write about them too much or the omega chi epsilon mafia might oxidize my home with thousands of tons of sulfuric acid. They're already after me for writing too much in that exergy article (looking over shoulder).

Anyway, consideration of isothermal chemical cycles using chemical potential is way too complex for a Wikipedia article like this one.
My main problem with the "Overview" section is that it kind of repeats what's already been said earlier in the article. Flying Jazz (talk) 01:25, 18 November 2007 (UTC)

[edit] Edge of Chaos

"It is conjectured that such systems tend to evolve into complex, structured, critically unstable "edge of chaos" arrangements, which very nearly maximise the rate of energy degradation (the rate of entropy production).[1]"

I removed the quoted sentence above for the following reasons:

  • In the reference that is provided, no page or section numbers are given, so we are left to assume that somewhere in the entire book a justification is given.
  • The reference book is written by a medical doctor, Stuart Kauffman, not a physicist, chemist, or engineer - as would be expected in an article about a physical phenomenon that deals with the transfer of heat and the statistics of large numbers of particles.
  • In the original sentence (and the one that exists today), it is unclear what is meant by "such systems". Does the author (previous word deliberately left ambiguous) mean "systems that are cooling"?

ChrisChiasson (talk) 17:59, 23 November 2007 (UTC)

Note: the custom on talk pages is to add new sections to the bottom of the page, not the top.
Stuart Kauffman spent 12 years at the Santa Fe Institute. His credentials are solid.
I don't really see what your problems are here. The claims being made are (1) that an increase in entropy can sometimes be accompanied by an apparent increase in order (if you only look at macroscopic appearances) -- e.g. an emulsion of oil and water spontaneously separating into an oil layer and a water layer. What matters here is how much freedom there is at a molecular level. (2) That systems through which there passes a steady flow of energy, from an external high-temperature source to a low-temperature external sink, can develop towards self-organising structures which tend to maximise the rate of entropy production.
Both of these phenomena occur in constant temperature systems. The previous authors did not mean systems that were cooling.
I've therefore reverted the article back to the state it had at 12:04, 23 November 2007. -- Jheald (talk) 19:20, 23 November 2007 (UTC)
I'll take this one issue at a time. So, for now, I am going to ignore your reversion of my citation requests which were made in edits separate from the "edge of chaos" sentence removal.
I am not saying the sentence is untrue. I don't know if it is true or false.
I am reserving judgement on the Santa Fe Institute and Kauffman's time there. Forgive me if I don't accept your assertion that his credentials are solid. If he studied (the mathematics of) entropy there, then maybe there isn't a problem with his credentials. However, I consider this question to be pertinent: what would a doctor know about entropy? After all, entropy is a rather abstract thing that is only rigorously defined when we speak of a system, its surroundings, and the heat, work, and mass interactions that take place -- are they training MDs in thermodynamics and physical chemistry these days?
With respect to the "edge of chaos" sentence, it still remains that the sentence is not referenced to page numbers in the book. Therefore, it is not attached to any specific "expert" claim within the book. Unless the thesis of the entire book *is* the "edge of chaos" sentence, then the reference doesn't apply. Following that, it would seem that the sentence should not survive a citation challenge and should therefore be removed. What say you?
BTW, constant temperature systems do not require zero heat transfer. Among other things, systems undergoing phase changes and chemical reactions can consume or reject heat while remaining at constant temperature.
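(To put a number on that: melting one kilogram of ice at 0 °C absorbs about 334 kJ while the mixture sits at a constant 273.15 K, so the system entropy rises by roughly <math>\Delta S = Q/T \approx 334{,}000\ \mathrm{J} / 273.15\ \mathrm{K} \approx 1.2\ \mathrm{kJ/K}</math>: heat transfer and an entropy change with no temperature change at all. The 334 kJ figure is the standard latent heat of fusion; the arithmetic is mine.)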
My main problems with the sentence itself include the phrases "critically unstable" and "very nearly maximise". If a structure is critically unstable, it would seem to me that it should break down. Note that the sentence does not say "stable" or "metastable". Also, why would something only "very nearly" maximise entropy production? AFAIK, systems always tend toward maximum entropy production, not "very near" maximum entropy production. Secondary problems include the fact that the original sentence mentions a conjecture but not the person who made the conjecture. Only later do we have a reference from some MD. Is the MD the person who made the conjecture? I am sure I could come up with more things that bother me about the sentence, but this is enough for now.
I have not given enough thought to your oil and water example to accept or reject it yet. There are at least three things to consider with such a system: (1) entropy of mixing/unmixing (2) friction heating as gravity pulls the immiscible fluids apart and they drag on each other (3) transfer of the friction heat out of the system (which will remove entropy from oil/water system). Perhaps the starting entropy of the oil/water emulsion could actually be higher than the entropy of the "oil floating on water" system. In that case, the human observation of more order in the separated system would be consistent with lower entropy. ChrisChiasson (talk) 02:16, 24 November 2007 (UTC)
The reason for the oil/water unmixing has nothing to do with gravity. It primarily has to do with the electric dipoles on the water molecules, which have to be arranged near the electric dipoles of other molecules to avoid overall charge concentrations. The oil molecules don't have dipoles, so can't achieve this. Near an oil/water interface, if charge concentrations are to be avoided, the water molecules on the water side have to be arranged in a manner more co-ordinated with each other than they normally would be in liquid water -- i.e. their entropy is lower. But when droplets in the emulsion merge, there is less interface surface area: the low-entropy region is reduced, so the entropy of the system as a whole can increase.
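(Schematically, and only as a sketch: write <math>S_{\mathrm{total}} = S_{\mathrm{bulk}} + s_A A</math>, where <math>A</math> is the total oil/water interface area and <math>s_A < 0</math> is the entropy deficit per unit area from the extra ordering of interfacial water. Droplet coalescence shrinks <math>A</math> while leaving <math>S_{\mathrm{bulk}}</math> essentially unchanged, so <math>S_{\mathrm{total}}</math> increases even though the separated system looks more ordered.)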
Frankly, I'm surprised you don't know that. It suggests you know rather less about entropy than the average biochemist, so perhaps you should be more careful about whom you accuse of not having the training.
BTW, the Santa Fe Institute is a very high-profile research centre, founded by some of the sharpest physicists at LANL to look at complex systems, with physics Nobel prize-winners like Murray Gell-Mann and Philip W. Anderson very much in its orbit. Trust me, these guys know about entropy - both mathematical and physical.
"Critically unstable" - the idea doesn't refer to stable or metastable states. Rather, the suggestion is that the system tends to become arranged in such a way that the detail of its future evolution becomes least predictable. The classic quoted example is a model of a pile of sand, with a new grain being added at regular intervals. The claim is that the sandpile tends to evolve to a shape where a new grain might trigger a landslide of any scale - so the sandpile is not just unstable, it is critically unstable. Yet after the landslide, the new sandpile will still have the same property - so there is a distinctive quality about the structure that has come into being, and persists. The detail of the structure is constantly changing, and depends on its immediate past history, and random fluctuations. So it is not in a single state which maximises entropy production. But all of the states very nearly maximise entropy production. That's the idea, anyway. Jheald (talk) 09:37, 24 November 2007 (UTC)
I don't think Wikipedia talk pages are a good place to quibble about the value of "credentials". ChrisChiasson, it shouldn't matter if an idea was published by a freshman theater major. If it expresses a clear and correct idea for the reader of a general purpose encyclopedia, it stays. If not, it doesn't stay.
Jheald, you wrote, "an emulsion of oil and water spontaneously separating into an oil layer and a water layer," and when ChrisChiasson replied mentioning gravity, you went on about how interfaces have nothing to do with gravity. I'm assuming you must have changed your mind about what you thought you were discussing. Of course gravity is required for an oil/water emulsion to form an oil layer and a water layer, since most people use the terms "an" and "a" to refer to "one." Without gravity, interfaces form, but does one water layer and one oil layer form, like ChrisChiasson thought you were writing about (because it's what you wrote)? And by the way, why do oil and water stay mixed if they are degassed? See here. Like The Who sang, "The simple things you say are all complicated." But, hey, that's science.
Anyway, I forgot who was arguing for what, but I'm removing the sentence for what (I think) is a sane reason. It's poor writing about a valuable if somewhat ambiguous concept, and it does not belong in a section of the article where clarity and real examples are most important. This is a second law article. The main section is "Criticisms." The subsection is "Complex systems," which debunks the fallacy that self-organization runs against the second law, with an appropriate reference to the self-organization article and a good concrete example of Benard cells. Following that up with a sentence that refers to the conjectural application of a useful metaphorical relationship with cellular automata is unencyclopedic and does not serve the reader, because it needlessly adds conjectural possibility after a real example has already been given. It's the textual equivalent of eating a cloud after dining on a beef brisket. Throwing in the phrase "energy degradation" also makes it seem like the metaphorical but useful entity violates the 1st law of thermodynamics. It's not about facts or credentials or who really meant what. It's about trying to make a good encyclopedia article.
I'm also removing the Kauffman reference. Not because it's wrong or bad, but because it doesn't seem to be used to write a top-notch article on the second law. If Kauffman's text makes extensive use of Benard cells then I made a mistake by removing it; please return it. But there's no need to reference "edge of chaos" here. Flying Jazz (talk) 07:20, 27 November 2007 (UTC)
The actual policy is that statements be verifiable, written from a neutral point of view, and that they contain no original research. My contention falls partly under verifiability, both because of the unspecific citation that was introduced by a different user, and because the author might be an unreliable source. ChrisChiasson 19:56, 1 December 2007 (UTC)
Everything about the separation described in Jheald's reply falls under step 1 of what I listed. BTW, NASA is going to do an oil/water separation experiment on Earth and in the International Space Station's microgravity. Let's see if they have layers form in the one on the space station. ChrisChiasson 19:56, 1 December 2007 (UTC)

[edit] The Big Picture

The Second Law has an infinite number of possible definitions, most of which lead to more confusion than less. I'm planning on editing the article with the definition: Without Being Acted Upon, Everything Degenerates to Its Most Chaotic State. I plan to lend a view of The Second Law, from outside the enclosed system of the Universe, because this is the only way to understand the direction of the time vector. Entropy has frequently been referred to, by scientists Einstein, Oppenheimer, Hawking and others, as the pointer of time. None of these men were able to succinctly illustrate, to the common person, what pointer of time means, or its ramifications for understanding how time is dependent upon entropy. Changing the directional attribute of entropy, the vector of time, while keeping the magnitude constant, does not cause time to point toward the past. The ramifications of The Second Law allow freedom for scientists to conceive of realistic interstellar travel possibilities. Digital processing systems can be programmed to simulate the formation of the Universe with entropy phased 180 degrees. The process of organization is much faster if the closed system of the Universe moves toward lower entropy. No one is exactly sure how much faster, but educated guesses for the length of time needed for the Universe to reach its present state, under conditions of entropy moving toward a lower state in the Universe, run between 10 seconds and instantaneously. This is evidence that A Being with sufficient Intelligence and Power could have ordered primordial chaos in six earth solar days.

I am surprised at the self-imposed limitations some of the editors place upon themselves, when the information needed is everywhere one would look. The US Federal Government has a database with 5 million physics articles. And reading periodical literature isn't a bad idea, either. Not every shred of knowledge is on the world wide web. —Preceding unsigned comment added by AwesomeMachine (talkcontribs) 16:18, 30 January 2008 (UTC)

Well...no. There are not an infinite number of definitions of the second law. There would be an infinite number if we imagined some version of the second law based on a perspective outside of this universe, because when we do that we can imagine any damn thing we want. We could imagine a second law that provides evidence of creation in six days and we could also imagine a second law that provides evidence of creation on the back of a giant cosmic turtle named Mwaggabwagga. I think it is wise for encyclopedia editors to place a self-imposed limitation on themselves of describing and defining physical laws as they exist wholly within this universe. I recommend that you contribute to a Wikipedia that exists outside this universe where all the other editors will share your uniquely unlimited perspective. Unfortunately, I do not have the URL for that site. Flying Jazz (talk) 22:23, 30 January 2008 (UTC)

[edit] Poll & discussion: Would a hypothetical Trapdoor violate LoT?

Please consider a hypothetical microscopic Trapdoor device, contained in a closed system at a fixed ambient temperature, that successfully creates a pressure differential between two chambers by moving gas particles from one chamber to the other through a Trapdoor mechanism. On a macro scale the ambient temperature remains constant, but mind you, microscopic temperature gradients always exist within the closed system. I am interested, percentage-wise, in how many believe such a hypothetical device violates LoT. PaulLowrance (talk) 16:48, 8 May 2008 (UTC)
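A toy numerical illustration of the premise (my own sketch, not drawn from any source, and not a physical model): if the door is treated as a perfect one-way valve and nothing else is modelled, the particles all drift to one side, which is the posited pressure differential. The standard analyses of Smoluchowski's trapdoor and Feynman's ratchet (mentioned below) argue that a real door in thermal equilibrium with the gas jitters open and leaks back, which this sketch deliberately omits.
<pre>
import random

# Two chambers, N gas particles, a *perfect* one-way trapdoor:
# each step one random particle strikes the door; strikes from A pass
# through to B, strikes from B are always blocked. The door's own
# thermal jitter (the usual second-law objection) is left out.

N = 1000
in_A = N // 2
for step in range(200000):
    if random.random() < in_A / N:  # the striking particle came from A
        in_A -= 1                   # it passes into chamber B
print("chamber A:", in_A, "chamber B:", N - in_A)
# Output drifts to "chamber A: 0  chamber B: 1000" -- the idealized
# door sorts the gas and creates the pressure differential.
</pre>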

This sounds exactly like the kind of perpetual motion machine which the second law indicates won't happen, statistically. "Violate" seems a rather legalistic term; "contradict" might be better. However, this being Wikipedia, the talk page is about improving the article, and we need verification that the question has been posed by a reliable source, as we can't include your original research or use this page to discuss it. Hope that helps, dave souza, talk 16:07, 27 May 2008 (UTC)
I think violate or violation is commonly used in the science community, but that's unimportant IMO. This topic seems relevant to the wiki article, as such a violation/contradiction would affect the article. Anyhow, thanks for the input, but I don't believe "perpetual motion" is the appropriate term. The Wikipedia perpetual motion page states, "Such a device or system would be in violation of the law of conservation of energy." A theoretical trapdoor device would merely convert thermal energy into usable energy. Understandably, the term "usable energy" is a point of view, in that what may be unusable to one person may be usable to another.
For example, it is well known that kTC noise causes the charge on a capacitor to vary over time, so that at any moment there is a voltage across the capacitor; this is known as reset noise. The capacitor voltage varies randomly between + and - polarity, but such a charge constitutes energy nonetheless, in that one may discharge such a capacitor and thermal energy will begin to recharge it in a random fashion. Reset noise is an unpredictable transfer of energy. Another example is a microscopic particle resting upon a surface within a gaseous atmosphere. Depending on the details, thermal energy will periodically move such a particle, thus performing mechanical work. Again, this is a random event, since the direction such a particle moves/slides is unpredictable, although one could observe such an event and *wait* for the particle to randomly move/slide to a desired location, and thus say thermal energy performed usable work. Anyhow, the point is that energy is merely transferred in such a theoretical trapdoor device, not created or destroyed, and therefore such a theoretical device is not a "perpetual motion" machine according to that definition.--PaulLowrance (talk) 19:51, 27 May 2008 (UTC)
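For concreteness, the size of the reset noise mentioned above follows from the equipartition theorem (a standard result; the numbers are my own illustration). The mean thermal energy on a capacitance C at temperature T satisfies
<math>\tfrac{1}{2} C \langle V^2 \rangle = \tfrac{1}{2} k_B T \quad \Rightarrow \quad V_{\mathrm{rms}} = \sqrt{k_B T / C}</math>
so a 1 pF capacitor at 300 K carries <math>V_{\mathrm{rms}} = \sqrt{(1.38 \times 10^{-23})(300)/10^{-12}} \approx 64\ \mu\mathrm{V}</math> of thermal voltage, corresponding to an average energy of only about <math>2 \times 10^{-21}</math> J.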

Sounds like Maxwell's demon to me. --Itub (talk) 17:20, 28 May 2008 (UTC)

The trapdoor is a well-known Maxwell's demon thought experiment. This topic is an old debate, and scientists are not all convinced either way. Some believe the entropy cost of the information required by such a trapdoor balances things out, and thus the trapdoor will not rectify thermal energy. I'm interested in what percentage of scientists believe a hypothetical Trapdoor violates LoT, and how such a violation would affect LoT. Thanks.--PaulLowrance (talk) 17:56, 28 May 2008 (UTC)

Should we include Feynman's ratchet where the demon is essentially a ratchet?--PaulLowrance (talk) 18:22, 28 May 2008 (UTC)

How does your hypothetical interest in a possible poll of scientists improve this article? Looks like archive time... dave souza, talk 19:40, 28 May 2008 (UTC)
The demon could well exist in your definition of temperature. Physchim62 (talk) 17:41, 30 May 2008 (UTC)