Talk:Entropy/Available Energy
2003
"The thermodynamic entropy S is a measure of the amount of energy in a system which cannot be used to do work" - i think this is wrong - the work that cant be done depends on the temperature too - the TS product matters. I have never seen such definition - where is it from - Terse.
- That sentence is from the original incarnation of the article by Tobias_Hoevekamp. It doesn't strike me as a particularly bad way to describe the entropy, although it is of course not absolutely accurate. It seems to me that the thermodynamic (as opposed to statistical) concept of entropy is difficult to define succinctly. Can you think of a better description? -- CYD
- That depends on whether or not the system is in a heat bath... Phys 15:10, 22 Aug 2003 (UTC)
- Perhaps we can make it a bit more precise, by mentioning that. It is a good intuitive interpretation of entropy, but it could also be misleading. Terse 22:38, 31 Aug 2003 (UTC)
- Entropy and work (energy) are different things. They are related through temperature. Jellyvista
Work Clarification Still Needed
The opening sentence of the article states that "entropy in the context of thermodynamics, is a measure of the amount of energy in a physical system that cannot be used to do work."
In the body of the article, the only statement about work not being able to be done is "Once the system reaches this maximum-entropy state, no more work may be done." But this seems to refer to maximum entropy, not entropy before maximum.
So is entropy before maximum still a measure of work that may not be done? Sitearm 04:34, 28 July 2005 (UTC)
- The entropy is a measure of the energy that cannot be used to do work, exactly in the sense that when the entropy is maximized, no more work can be done. It's an indirect measure, because entropy has units of heat capacity, not energy. -- CYD
First paragraph rewritten.
I have rewritten the first paragraph for the following reasons.
1. It contained an error, stating "the entropy in the context of thermodynamics, is a measure of the amount of energy in a physical system that cannot be used to do work." This is incorrect; entropy is in no sense a measure of energy. A system can have high energy and low entropy (such as a set of bar magnets all lined up opposite to a magnetic field) or low energy and low entropy (such as a set of bar magnets all lined up in the same direction as a magnetic field). In both cases, you know the entropy is low, because the bar magnets are all lined up in a very organized fashion. However, you know that the bar magnets lined up opposite to the magnetic field have higher energy, because you could get work out by releasing the magnets from that position. (A numeric sketch of this point follows below.)
2. Because entropy is such an abstract concept, it is difficult to grasp it intuitively without a lot of experience (just as energy would be a difficult concept to grasp if we didn't talk about it all the time in daily life). However, I think the popular confusion over the concept is exacerbated by the fact that two different definitions are given, and they seem quite different -- because they are. I hope that my version of the new paragraph has better explained the relationship between the two definitions.
67.186.28.212 13:02, 23 August 2005 (UTC)
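A minimal numeric sketch of the bar-magnet point in item 1, in Python, with illustrative magnet values that are not from the discussion: both fully ordered configurations have W = 1 microstate, and hence the same Boltzmann entropy S = k ln(W), while their energies differ.

 import math

 k_B = 1.380649e-23  # Boltzmann constant, J/K

 def boltzmann_entropy(num_microstates):
     # S = k ln(W): depends only on the number of microstates, not the energy
     return k_B * math.log(num_microstates)

 # N bar magnets of moment m in a field B, all aligned one way: W = 1.
 N, m, B = 10, 1e-2, 0.5   # hypothetical values (count, A*m^2, tesla)
 E_with = -N * m * B       # moments parallel to the field: low energy
 E_against = +N * m * B    # moments antiparallel: high energy

 print(boltzmann_entropy(1))   # 0.0 J/K for both ordered configurations
 print(E_with, E_against)      # -0.05 J vs +0.05 J: same entropy, different energies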
Stating that entropy is the quality of heat is incorrect. Temperature may be considered the quality of heat; entropy is the corresponding quantity.
- Entropy is not the quantity of heat; heat is the quantity of heat! Entropy is defined (in the Clausius definition) as heat transfer divided by temperature, hence is not the same as heat. Furthermore, the unit of heat is joules, and the unit of entropy is joules (or joule, as you like) per kelvin, and thus they cannot measure the same thing. 67.186.28.212 23:56, 21 September 2005 (UTC)
The unit is joule per kelvin, not joules per kelvin. This unit, by the way, is also an SI unit of quantity of ideal gas. It ought to have its own name, the clausius; electric charge has got its own unit name, the coulomb, and not just joule per volt. The mole is related to the joule per kelvin by a conversion factor called the gas constant. This is bad design in the SI. The Boltzmann constant is the conversion factor between the unit 'joule per kelvin' and the gas quantum 'one molecule'.
- The bad design of SI is irrelevant to Wikipedia. Wikipedia's purpose is to report, not to present original research. 67.186.28.212 23:56, 21 September 2005 (UTC)
The analogy between electricity and thermodynamics is:

            Electricity                           Heat / gas
 quantity   charge, coulomb = joule per volt      entropy, joule per kelvin (= clausius)
 quality    voltage, volt = joule per coulomb     temperature, kelvin (= joule per clausius)
 quantum    electron (= e coulomb)                molecule (= k clausius)
 current    ampere = coulomb per second           clausius per second
The difference is that electric charge is conserved while thermodynamic charge, entropy, is created by all irreversible processes. Bo Jacoby 09:38, 14 September 2005 (UTC)
- You are correct in finding an analogy between voltage and temperature: they are both potentials, in a sense. I have never heard the term "quality" used to describe a potential, nor can I find this usage anywhere on the net ("quality of electricity" is used on the net to talk about how steady and environmentally friendly the power supply is, but I cannot find it used to describe electric potential). Can you cite a peer-reviewed source to justify your usage of the term quality? 67.186.28.212 23:56, 21 September 2005 (UTC)
- The present introduction states: The entropy can also be understood as the "quality" of heat flowing between two bodies. So here you find the word "quality". I object to this use, however. It is a misunderstanding. I wrote the alternative introduction which nobody liked, but the idea is this. The money (dE) you pay for buying some stuff is the product of the price (T) and the amount (dS). dE = T dS. The amount is the quantity and the price reflects the quality. The analogy is that the energy (dE) you receive from a hot body is the product of the temperature (T) of that body and the emitted entropy (dS). So temperature is a quality and entropy is a quantity. That is better than saying that entropy is quality, which is the present sad state of affairs. Bo Jacoby 14:14, 6 October 2005 (UTC)
Today I have replaced the introduction with a section called 'flow of heat'. It remains to adjust the rest of the article - it is no longer true that the Clausius approach is postponed until later. I'd like very much to hear your comments. Bo Jacoby 08:33, 15 September 2005 (UTC)
I used the letter C for electrical charge, but the article quantity of electricity says that it is called Q. So I changed it. Later in the article the letter Q is used for 'heat'. I think that should be changed. We have E for energy, so we don't need Q for heat. Bo Jacoby 09:53, 16 September 2005 (UTC)
- Heat is not the same as energy, so you cannot substitute E for Q. In the notation you propose, for instance, the first law of thermodynamics would be expressed dE=dE-dW, which is clearly nonsense (whereas the usual notation dE=dQ-dW shows the difference between dE and dQ very precisely). 67.186.28.212 12:00, 21 September 2005 (UTC)
Remarks on reverted introductory paragraph.
It is still not correct that entropy "can be understood as a measure of the amount of energy in a system that cannot be used to do work." (1) It has the wrong dimensions to be a measure of energy. (2) The definition given is more properly a definition of the *free energy* (or its negative, since you say energy *not* available to do work). If the purpose is to give a colloquial definition to give people an intuitive idea of what entropy is, it does no good to give them an intuitive idea of what free energy is, and tell them that's entropy. (3) Item 1 under "first paragraph rewritten" at the top of the talk page gives an example of how the energy available to do work is *completely uncorrelated* with the entropy -- the energy available to do work can be either high or low, for the same entropy. P.S. I am not defending the edits made by Bo Jacoby. 67.186.28.212 15:02, 22 September 2005 (UTC)
- It is a measure of, not equal to, the energy that cannot be used to do work. See the last paragraph of "Entropy change in irreversible transformations":
- For example, consider a gas enclosed in a piston chamber whose walls are perfect thermal insulators. If the pressure of the gas differs from the pressure applied to the piston, it will expand or contract, and work will be done... the entropy of the system will increase during this process... Typically, there exists a maximum amount of entropy the system may possess under the circumstances. This entropy corresponds to a state of stable equilibrium... Once the system reaches this maximum-entropy state, no more work may be done. -- CYD
- Here is a counter example. Let us consider the situation of a cylinder whose walls are not thermally insulating. For simplicity, let us consider a quasi-static process, so dS=dQ/T. As long as the piston is being pulled out (dV>0), the gas does positive work p dV on the environment; but the change in the entropy of the gas can be *either positive or negative,* depending on whether we are adding or removing heat from the gas through the cylinder walls (dS=dQ/T, so can have either sign, depending on whether dQ>0 or dQ<0). Thus, there is *no correlation* between the amount of work done by the gas, and the entropy of the gas. For the *same work,* the change in the entropy of the gas can be *positive or negative.* [There is no problem with this argument if the process is slightly irreversible; dS>dQ/T, but dS can still be negative if dQ is large and negative enough.]
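To put numbers on this counterexample, a small sketch in Python, assuming one mole of a monatomic ideal gas and illustrative temperatures and volumes: the gas does positive work in both expansions, but the sign of its entropy change depends on the heat exchanged through the walls.

 import math

 R = 8.314        # gas constant, J/(mol K)
 Cv = 1.5 * R     # molar heat capacity of a monatomic ideal gas
 n = 1.0          # moles

 def delta_S(T1, T2, V1, V2):
     # entropy change of an ideal gas between states (T1, V1) and (T2, V2)
     return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

 # Both processes double the volume, so the gas does positive work (p dV > 0):
 print(delta_S(300.0, 300.0, 1.0, 2.0))   # isothermal: dS > 0 (heat flows in)
 print(delta_S(300.0, 150.0, 1.0, 2.0))   # expansion with cooling: dS < 0 (heat flows out)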
- With regard to the last sentences you quoted from the article: it is not correct that the system will come to equilibrium when the entropy *of the system* is maximized. The correct statement is that the system will come to equilibrium when the *total* entropy of the system *plus the environment* is greatest. Now the *free energy* of the system is minimized in equilibrium; the derivation of this statement follows mathematically from maximizing the *total* entropy of the system plus the environment.
- I don't have my books with me at the moment, but I'll cite chapter and verse number tomorrow morning. 67.186.28.212 02:17, 29 September 2005 (UTC)
- Yes, I goofed; obviously, the system has to be not only thermally but also mechanically isolated if the maximum-entropy state is to be the state of stable equilibrium. But I think I know how to fix the problem. -- CYD
- For example, consider an insulating rigid box divided by a movable partition into two volumes, each filled with gas. If the pressure of one gas is higher, it will expand by moving the partition, thus performing work on the other gas. Also, if the gases are at different temperatures, heat can flow from one gas to the other provided the partition is an imperfect insulator. [...] Once the system reaches this maximum-entropy state, no part of the system can perform work on any other part. It is in this sense that entropy is a measure of the energy in a system that "cannot be used to do work".
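A toy Python check of this partition picture, assuming two monatomic ideal-gas halves that exchange only heat, with a made-up fixed total energy: the total entropy is maximized at the equal-energy (equal-temperature) split, at which point neither side can do work on, or heat, the other.

 import math

 Cv = 1.5 * 8.314   # heat capacity of each half (one mole, monatomic), J/K
 E_tot = 5000.0     # fixed total energy of the isolated box, J (illustrative)

 def total_entropy(E1):
     # S1 + S2 up to additive constants; each side's T is proportional to its E
     E2 = E_tot - E1
     return Cv * (math.log(E1) + math.log(E2))

 # Scan the possible energy splits: the maximum-entropy state is the even split.
 print(max(range(500, 4501, 500), key=total_entropy))   # 2500 -- equal temperatures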
- I find your new formulation still to be problematic.
- (1) Your example still does not illustrate any special connection between entropy and work. For instance, in your example, once the system has reached equilibrium, no amount of *heat* can be exchanged between one part and another, either. Thus, you could equally well have said, "In this sense, the entropy is a measure of the energy in a system that cannot be used to exchange heat." The fact is that in equilibrium, the entropy will be maximized *with respect to any and all degrees of freedom.*
- (2) Once the system is put back into contact with the environment, it will be able to do work on the external environment. So what you mean is that when a system's entropy is maximum, there's no energy available to do *internal* work. Thus, the sense in which there's 'no energy available to do work' is *quite* a limited sense.
- Thus, I have now provided two examples illustrating a quantitative lack of correlation between entropy and the energy available to do work, even to the extent of having opposite signs; and you have provided an example in which the relationship between entropy and work is the same as the relationship between entropy and heat. I repeat that the correct statement is that the *free energy* is the measure of the energy available to do work.
- If you want a fundamental thermodynamic (as opposed to statistical mechanical) definition of entropy, it is Clausius's: dS=dQ/T. The colloquial interpretation of this equation is that heat coming from a high-temperature source (or more generally, energy with low entropy) is more useful for performing work. As you know, the energy conservation problem is actually a problem of entropy conservation; energy itself is neither created nor destroyed, but it can be converted to high-entropy forms, which are less useful. (Goodstein, "Out of Gas: The End of the Age of Oil.")
- I'm sorry that I haven't given you references yet to support my other statements. I will do so tomorrow. Can you in return cite a source that supports your contention that entropy is a "measure of energy" in any sense whatsoever? 67.186.28.212 18:40, 2 October 2005 (UTC)
- Could you try editing the article directly? I agree fully with your points; the difficulty is phrasing it in a single sentence in a manner that's comprehensible to the lay reader. (Btw, it is misleading to talk about "energy with low entropy", because entropy is a function of a system's state. It is an independent thermodynamic variable.) -- CYD
- 'Energy with low entropy' is sunshine, or electrical power, or generally: heat flowing at a high temperature. dE=TdS. Bo Jacoby 08:17, 6 October 2005 (UTC)
- That's a circular argument, because then you have to define what temperature is, and temperature is a function of a body's state. -- CYD
- Entropy and Temperature are defined together. It is tricky, but not circular. The sun is hot and the night sky is cold. So the flow of energy from the sun via the earth to the universe has increasing entropy. The temperature of the sunshine is about 6000 kelvin, so the entropy flow from the sun is (1/6000 joule per kelvin) per joule. This is energy with rather low entropy. The temperature of the earth is about 280 kelvin, so the entropy radiated away from the earth is (1/280 joule per kelvin) per joule. This is energy with somewhat higher entropy. Bo Jacoby 13:34, 6 October 2005 (UTC)
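The arithmetic here is easy to check; a two-line Python sketch using the same round temperatures:

 T_sun, T_earth = 6000.0, 280.0   # kelvin, as above
 print(1.0 / T_sun)     # ~1.7e-4 J/K carried per joule: low-entropy energy
 print(1.0 / T_earth)   # ~3.6e-3 J/K carried per joule: higher-entropy energy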
- They are defined together as properties of a system, not properties of "heat". -- CYD
- Temperature is a property of the system. The entropy increment dS=dE/T is defined by the transferred energy dE. (I'm not talking about "heat".) The entropy content of a body is defined as the integral of entropy increments. This leaves the zero point undefined. Bo Jacoby 14:26, 6 October 2005 (UTC)
- You're using a nineteenth-century (pre-Boltzmann, pre-quantum mechanics) understanding of entropy. See Third law of thermodynamics. -- CYD
- Yes. Macroscopic physical understanding is still valid. The definitions of absolute temperature and of entropy depend on it. The constants of Boltzmann, Wien, and Planck do not appear in these definitions. They may be derived by thermodynamic measurements, but not by thermodynamic theory alone. Macroscopically dS=0, dT=0, dE=0, and even T=0 are well defined, but S=0 and E=0 need additional definitions. -- Bo Jacoby 08:57, 7 October 2005 (UTC)
- The article is about entropy, not "entropy as understood in the nineteenth century, willfully ignoring all subsequent work". -- CYD
- The section Entropy#Thermodynamic_definition_of_entropy is about "entropy as understood in the nineteenth century, willfully ignoring subsequent work". The discussion above is definitely not about subsequent work. The question was "energy with low entropy" and my answer was sunshine. That is true irrespective of subsequent work. Subsequent work might call it "energy with few photons". Bo Jacoby 13:37, 7 October 2005 (UTC)
Well, I see I'm opening an old argument, but the first sentence is not right, or is at least misleading. The common usage of the term "energy available to do work" means the maximum amount of work you could get a system to do to its surroundings. Entropy is not a measure of that. I'm not even sure if it's a unique measure of the amount of work that can be done internally. There are two definitions of entropy:
- The statistical mechanical definition - S=k ln(W) where W is the number of microstates that a macrostate may have.
- The thermodynamic definition - dS=δQ/T for a reversible transformation.
Then there is a statistical mechanical argument that connects the two. That seems like a lot to put into an introductory sentence, but we should at least put something that is correct. For the thermodynamic entropy maybe something like:
Entropy is a thermodynamic quantity that is a measure of how much a system has "equilibrated". Differences in temperature and pressure, for example, tend to drive a system toward uniform temperature and pressure, and the more equilibrated it becomes, the more its entropy increases.
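For the statistical definition above, a small Python sketch using a toy two-state model (our own illustration, not from the discussion): the "equilibrated" macrostate is the one with the most microstates, and so the highest S = k ln(W).

 import math

 k_B = 1.380649e-23   # J/K

 def S(N, n_up):
     # S = k ln(W), with W = C(N, n_up) microstates for the macrostate
     # "n_up of N two-state particles point up"
     return k_B * math.log(math.comb(N, n_up))

 print(S(100, 0))    # 0.0: a single microstate, perfectly ordered
 print(S(100, 50))   # maximal among these macrostates: the most "equilibrated" one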
- The previous critique was mostly on the lines that entropy doesn't have the dimensions of energy; and so it doesn't. But saying that S T_R is that part of the current internal energy which would be the very minimum unavailable to do work at a temperature T_R, no matter how well you built your heat engine, is (I believe) correct. E - S T_R is the simplest form for the thermodynamic function called availability.
- The thermodynamic definition is more general. But I think the availability definition is right for the opening sentence, because it gives something more operationally tangible that readers can hang onto, as their first introductory answer to the question, "why is entropy important, and why should anyone be interested in it?" That's why I think most dictionaries go with this (or rather, a less specific version, usually missing out the important T_R) as their first capsule definition.
- (Note: a more complete formula for the availability is
- A = E - S T_R + P_R V - μ_R N, or
- dA = dE - T_R dS + P_R dV - μ_R dN,
- when one considers the energy that the system could gain from an external pressure reservoir, or will not give up to an external particle reservoir. But if we are just considering what minimum amount of the system's current internal energy is not available to do work, then the definition as presented is (I believe) correct. And I think it gives the best first sense of what entropy is, and why one should be mindful of it. Note that dA <= 0 for any internal change. Reference: Waldram, Theory of Thermodynamics, p. 75).
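A direct transcription of the differential formula above into Python, with an illustrative irreversible change (not from the discussion) showing dA <= 0:

 def dA(dE, dS, dV, dN, T_R, P_R, mu_R):
     # dA = dE - T_R dS + P_R dV - mu_R dN; the reservoir's T_R, P_R, mu_R are fixed
     return dE - T_R * dS + P_R * dV - mu_R * dN

 # Example: heat dQ = 1 J leaks from a system at T = 600 K into a reservoir
 # at T_R = 300 K, with no volume or particle change:
 dQ = 1.0
 print(dA(-dQ, -dQ / 600.0, 0.0, 0.0, 300.0, 1e5, 0.0))   # -0.5 J, i.e. dA < 0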
- I think it's advisable, as far as possible, to do the thermodynamic perspective first -- i.e. the limiting case when the thermodynamic limit applies, fluctuations are negligible compared to the average quantities, the macroscopic variables act as an autonomous dynamical system, and the second law is a fact, not a tendency.
- It's important, IMO, to set out thermodynamics as a self-contained theory that applies in this limit, which doesn't rely on statistical mechanics. This is helpful for understanding, IMO.
- Then statistical mechanics can be introduced as a more general, more detailed explanation of what is going on, which doesn't rely on the thermodynamic limit.
- I believe that there is much more clarity, and much less chance for confusion, the more that the reader can appreciate that there are two distinct pictures here. The more separation we can get in the article between the two pictures the better, IMO, so the newcomer appreciates most readily when they are stepping from one picture to another.
- That is why I am, as far as possible, against muddling in the microscopic view of entropy as a measure of mixedupness/disorder/uncertainty until the thermodynamic picture has been presented separately, as intuitively as possible. And why there should be clear signposting "you are now leaving the thermodynamic sector" whenever the transition of viewpoints is made.
- Failing to keep the two pictures sharp and distinct is, IMO, one of the things that can most muddle people and make them feel uncomfortable or unsafe about the subject.
- Casually blurring together explanations of entropy from the two pictures is IMO not helpful. -- Jheald 10:56, 24 November 2005 (UTC)
- I totally agree that the distinction between thermodynamic entropy and the statistical mechanical explanation of entropy should be kept distinct. However, I still have a problem with the idea that TS is the energy unavailable to do work. A = E - TS is the Helmholtz free energy, and for a process going from state 1 to state 2, the change is ΔA = -PΔV as long as the initial and final temperatures are the same and the number of particles is the same. That means that -ΔA is the amount of work you can get out of the system at constant T and N, going between those states. However, A is not the amount of work you can get out of the system. If you take a system and maintain it at temperature T, you can get an infinite amount of work out of it, so the idea that -A is the amount of work you can get is wrong. Take, for example, an ideal gas. A is of the form A = NkT(Φ - ln(V)) where Φ is a function of N and T only and V is volume. You can see that A is finite at volume V_1, but the work available between states 1 and 2 is NkT ln(V_2/V_1), which is infinite if you let V_2 go to infinity. The fact that -A is not the work available means TS is not the work that's not available.
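PAR's divergence point is easy to verify numerically; a Python sketch for one mole (so Nk = R) at an illustrative 300 K:

 import math

 R, T = 8.314, 300.0   # one mole, so Nk = R; temperature held fixed

 def isothermal_work(V1, V2):
     # reversible isothermal work done by an ideal gas: NkT ln(V2/V1)
     return R * T * math.log(V2 / V1)

 for V2 in (2.0, 10.0, 1e6):
     print(isothermal_work(1.0, V2))   # grows without bound as V2 -> infinity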
- It's true A can become negative, so -ΔA can be larger than A. On the other hand, -ΔA is only going to be infinite if you are expanding against a zero external pressure. If the external pressure is finite, -ΔA will also be finite. For a given end state, and a given initial internal energy, the fact remains that the higher the initial entropy, the less free energy is available to do work. -- Jheald 00:38, 25 November 2005 (UTC)
- But then entropy is not a measure of the unavailable energy; it's also a function of pressure. "A is a measure of B" means that there's a one-to-one correspondence: if you have so much of A today and so much of B today, then when you have the same amount of A tomorrow, you will again have the same amount of B.
- The above holds no matter what you define the "availability" as. Taking the above statement that A = E - TS + PV - μN is the availability, the differential statement following that statement is wrong; the derivative of TS is TdS + SdT, for example. PAR 16:54, 24 November 2005 (UTC)
- The differential is actually right, because T_R, P_R and μ_R in the definition of Availability are all assumed to be fixed properties of the external surroundings, so there are no terms like S dT_R. Availability can be thought of as a generalisation of free energy, because the temperature in the definition is the external temperature T_R, rather than the current temperature. It's actually a generalisation of the Gibbs free energy, because the Availability function includes the term P_R V so that differences in Availability measure the "useful" work, rather than the total work available; any work done merely to expand against the external pressure is not considered useful. -- Jheald 00:38, 25 November 2005 (UTC)
- Ok, I didn't realize the R meant the surroundings. But this is getting really confusing. We have to set up a concrete situation. So far, I understand that there is a reservoir at temperature T_R, pressure P_R, chemical potential μ_R. By reservoir is meant that the surroundings are so huge that they are unaffected by the system in question. There is a system; I'm not sure of its initial or final state -- could you specify that? Also, I'm not clear on how the system is connected to the reservoir. There is no particle transfer, right? But there is heat transfer and work transfer; in other words, the system is thermally and dynamically connected to the surroundings. Is that right? Are you saying that the entropy of the system in its initial state is a measure of the fraction of the internal energy of the system which cannot be converted to work in going to some final state, when it is constrained to have the same temperature and pressure as the surroundings? I mean, I'm trying to get a very clear idea of what you are saying. PAR 01:41, 25 November 2005 (UTC)
- Yes, final state (when maximum work has been done) is full equilibrium with the reservoir. Initial state is whatever you like; but the easiest case would be a piston of gas at temperature T and pressure P. Depending on possible different temperatures of the reservoir, I'm claiming that the (minimum) amount of energy unavailable to do work varies as T_R S, if you were to vary T_R. But I may need to go away and think about this some more. -- Jheald 08:52, 25 November 2005 (UTC).
Let's just consider the initial state to be at the reservoir temperature. Then the total amount of work you can get out of the system is the change in the Helmholtz free energy, δW = ΔA, which is just δW = ΔU - TΔS. For an ideal gas, ΔU = 0 at constant temperature, but this is not true in general. Even the change in entropy at constant temperature is not a measure of the unavailable work. PAR 11:24, 25 November 2005 (UTC)
- I found a rather nice and very clear presentation of the subject in the beautifully written 1911 Britannica article on "Energetics". Unfortunately, although there is a version of the article online [1], it is almost unreadable because the OCR software couldn't cope with the formulae. The relevant section of the article starts "Available energy". I almost got things right, but not quite!
- The answer is that if we suppose that there is available a reservoir of unlimited heat capacity at a temperature T_R, and we ask how much of the internal energy of our system is available for mechanical work, in a process which reduces the whole system to the temperature of this condenser, we find that, relative to this condenser, a general state of the system C contains more available energy than a general state of the system D by an amount E_C - E_D - T_R(S_C - S_D); where neither C nor D need necessarily be at temperature T_R, nor indeed need even have a single system-wide temperature.
- So while it is not true that T_R S represents the energy unavailable to do work in absolute terms, it is true that T_R ΔS represents the energy unavailable to do work out of any change in internal energy ΔE. -- Jheald 09:21, 27 November 2005 (UTC)
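In Python, with made-up numbers: if the system gives up ΔE = 100 J while its entropy falls by ΔS = 0.2 J/K, and the condenser sits at T_R = 300 K, then at most ΔE - T_R ΔS can come out as work.

 def max_work(dE, dS, T_R):
     # of the energy dE given up, at least T_R*dS must be dumped as heat at T_R
     return dE - T_R * dS

 print(max_work(100.0, 0.2, 300.0))   # 40.0 J available; 60.0 J unavailable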
Ok - the part I disagree with in the above is that C and D can be at any temperature. They must be at the same temperature T_R, and the whole transition from D to C must be done at temperature T_R.
It's hard for me to explain it without some PV (pressure-volume) diagrams, so I uploaded these quick sketches:
In all of them, there is an isotherm labelled T_R. The area under any path is the amount of work done (the integral of P dV).
- Case 1 - the shaded area is your ΔW = E_C - E_D - T_R(S_C - S_D). You can see this because, by the first law, the work done is the integral of P dV from point D to C along the T_R isotherm, and integrating from D to C with T_R outside the integral, it's easy to see.
- Case 2 - If you don't stay on the isotherm, the shaded area changes; that means the amount of work done changes. Going from D to C I can get any amount of work I want, if I pick the right path.
- Case 3 - If C and D are not on the isotherm, but I choose this path where they drop directly down to the isotherm, doing no work, it's still not right, because the shaded area is ΔW = E_C′ - E_D′ - T_R(S_C′ - S_D′), which is not the same as E_C - E_D - T_R(S_C - S_D).
Again, the only way ΔW = E_C - E_D - T_R(S_C - S_D) is true is if C and D and the entire path between them lie on the T_R isotherm.
During an isothermal process, a certain amount of energy ΔE may be withdrawn from a system. ΔE - T_R ΔS is the maximum amount of that energy that can be withdrawn as work. T_R ΔS is the minimum amount that cannot be withdrawn as work, during that isothermal process.
- The argument goes like this. Suppose that there is a small quantity of heat δQ at a temperature T somewhere in the system, that we want to use to do work. Removing that heat from the system will cause a fall in entropy of dS = δQ/T. For the process to be possible, the entropy of the surroundings must increase by a corresponding dS. So we must dump a minimum heat energy of at least T_R dS = T_R δQ/T into the reservoir. This is the energy unavailable to do work; the energy which is available to do work is the remainder of δQ, and the non-heat energy of the system. Integrating over the whole path from state C to state D thus gives a total energy T_R ΔS unavailable to do work.
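The same argument as a short Python sketch (illustrative source temperatures, with T_R = 300 K assumed): of heat δQ drawn at temperature T, the fraction T_R/T is unavailable, which is just the Carnot limit.

 def available_fraction(T, T_R=300.0):
     # work obtainable per unit of heat drawn at T, with the reservoir at T_R
     return 1.0 - T_R / T

 for T in (1200.0, 600.0, 310.0):
     print(available_fraction(T))   # 0.75, 0.5, ~0.032: shrinks as T -> T_R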
- Turning to the three cases you've drawn. Obviously, no problem with case 1, though insisting T = T_R unnecessarily limits it to a particular special case. Case 2 is impossible because you're getting work out of nowhere: where is the energy to do the work above the isotherm coming from? It's not the system, that's already accounted for in state C; but it's not the surroundings, because heat from the surroundings isn't going to be available at a temperature above T_R. Case 3 is unnecessarily inefficient. I'm not saying that the energy that is available to do work is the same as on the isotherm T_R -- what I'm saying is that the energy that isn't available to do work is the same as that on the isotherm. This is a very different statement.
- Note that for the high temperature cylinder + low temperature surroundings/condenser, T_R ΔS < ∑ T δS -- the presence of an infinite capacity low temperature condenser reduces the energy which is unavailable for work, as compared to, say, a solely high-temperature isothermal process. -- Jheald 23:00, 29 November 2005 (UTC)
Ok, if you don't mind, I would like to get very specific here, because there is a lot I am missing in the above argument.
My first objection, although I'm not sure if it is the cause of the problem, is this: can we use dU instead of δQ for an amount of thermal energy? δQ measures the amount of energy transferred by a heating process, but doesn't refer to a resident quantity of energy. Heat and work are very similar concepts. Anywhere you use "heat (verb)", "temperature" and "entropy", you should be able to replace them with "do work", "pressure" and "volume" and have a sensible statement (which may be wrong, however). So the phrase "Suppose that there is a small quantity of heat δQ at a temperature T somewhere in the system, that we want to use to do work" makes as much sense as saying "Suppose that there is a small quantity of work δW at a pressure P somewhere in the system, that we want to use to heat." What you mean to say is "Suppose that there is a small quantity of thermal energy dU at a temperature T somewhere in the system, that we want to use to do work." The two are equivalent only if the volume is held constant, and I'm not sure if this implicit assumption of PdV = 0 isn't screwing things up.
- When I wrote the above paragraph, I did actually write energy instead of heat in my first edit, even though 1911 Britannica says heat; but then I re-edited it and corrected it. The important point for the argument is that removing the energy causes the entropy of the system to drop by dE/T. That is not true for energy accounted for in macroscopic degrees of freedom, i.e. changes P dV; it is only true for energy dispersed into microscopic degrees of freedom.
- There is no assumption PdV = 0. In fact, almost the opposite: none of the energy in the macroscopic degrees of freedom is unavailable to do work in an isentropic expansion. -- Jheald 09:45, 30 November 2005 (UTC).
Second, can we make the situation specific by saying that at the beginning there is a system with parameters U, T, P, V, S, etc., which is thermally and mechanically connected to a reservoir with parameters U_R, T_R, P_R, V_R, S_R, etc. Thermally connected means it can transfer heat; mechanically connected means it can do work on the reservoir by expanding into it. Let's suppose the system is hotter than the reservoir and it expands into the reservoir. After a small amount of time, the system is at U - dU, T - dT, V + dV, S - dS, etc. and the reservoir is at U_R + dU, T_R, V_R - dV, S_R + dS_R. Note that energy and volume conservation have been implied, since the same dU and dV occur on both sides.
Before I analyse this system, I just want to make sure that it is a particular case of what you are talking about. Can this setup be used to show what you are saying? PAR 06:37, 30 November 2005 (UTC)
- Yes. Some of those assumptions are stronger than necessary, but that would certainly be a relevant particular case. -- Jheald 09:45, 30 November 2005 (UTC).
Ok, another aspect I don't understand is this: why can't you simply thermally insulate the system from the reservoir and let it expand by dV, assuming its pressure is greater than that of the reservoir? That will be adiabatic, with no change in entropy, and the entire dU lost by the system will be converted to work. Or, if you prevented the system from expanding, the entire dU lost by the system would be transferred as heat. It seems to me there's no absolute relation between dS and the amount of work you can or can't do. PAR 16:09, 30 November 2005 (UTC)
- For your first example, if two subsystems are adiabatically isolated, then one cannot do any work on the other, so by letting one expand aren't you explicitly saying that the two are not thermally isolated? (I could be wrong.) —BenFrantzDale 16:57, 30 November 2005 (UTC)
- Note: thermally isolated does not necessarily imply mechanically isolated. — Jheald 18:23, 30 November 2005 (UTC).
- PAR: If you recall, my claim now is that out of energy ΔU given up by the system, an amount at least T_R ΔS must be given up as heat, and so be unavailable to do work. This encompasses the example you've given: for a reversible adiabatic expansion, with no change in entropy, the entire ΔU lost by the system will indeed be available to do work. But something like a Carnot cycle includes isothermal steps, as well as adiabatic ones. In the Carnot cycle context, having isothermally injected heat and increased the entropy an amount ΔS_AB by moving from state A to state B, both at temperature T, one then wants to ask how much work one can get out of the system, returning from B to A by a different route. Answer: all but T_R ΔS_AB. — Jheald 18:23, 30 November 2005 (UTC).
Then the statement is now "In a Carnot cycle, out of energy ΔU given up by the system, an amount at least T_R ΔS must be given up as heat, and so be unavailable to do work." That I agree with. PAR 03:54, 2 December 2005 (UTC)
I struck out the above statement, because ΔU is obviously zero in a Carnot cycle. What you are basically saying now is that the statement "unavailable to do work" has no meaning except in the context of a Carnot cycle. If there is an increment of internal energy ΔE, all of it can be turned into work, and therefore there is no part of it that is "unavailable".
I think what is being confused here is the total amount of heat absorbed by a system and its internal energy. It is clear that if thermal energy ΔQ is absorbed by a system, then all of that absorbed energy except T_R ΔS can be converted into work in the process of restoring the system to its original state. But this present formulation is far from the original statement, so I have replaced it with a more general idea of entropy, until we can figure out something better. PAR 20:02, 3 December 2005 (UTC)
- PAR, I fail to understand what you object to in the following statement:
- There is an important connection between entropy and the amount of internal energy in the system which is not available to perform work. In any process where the system gives up an energy ΔE, and its entropy falls by ΔS, a quantity at least T_R ΔS of that energy must be given up to the system's surroundings as unusable heat. Otherwise the process will not go forward.
- It seems to me that that statement is exactly correct. It is correct in the context of the return leg of a Carnot cycle. It is correct when we are considering any general change in the system, not as part of a Carnot cycle. It is correct when T_R = T, as a definition of Helmholtz free energy. It is correct whenever we can consider the system's surroundings as being at a well-defined temperature T_R, or whenever the system is in contact with a heat sink into which it can dump entropy at a temperature T_R. It's basically a quantification of the Kelvin statement of the Second Law. I simply don't understand where your problem is with the statement. -- Jheald 20:44, 4 December 2005 (UTC).
The best way I can think of to explain my objection is to come up with examples.
- If I have a system in state A, and I add energy to it, extract work from it, etc., until it winds up in state B, then I agree that the most work you can get from it is equal to the difference in the internal energy, plus the amount of energy added, minus T_R ΔS, as long as the system is not allowed to go below T_R in temperature. But the statement says
- "at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat"
- Am I nitpicking to say how do we know that the TR ΔS refers to that energy and not some other energy that has been added to the system?
- I could get the system to a temperature below TR by allowing the system to adiabatically expand. Then, if I was careful, I could isothermally compress the system at that sub-TR temperature. Effectively I could go around the minimum TR restriction.
I would say that if we have a system that goes from A to B, without interacting with any other system except a reservoir at TR, and is not allowed to have its temperature drop below TR, then only ΔE-TRΔS of the change in internal energy can possibly be made to do work. I remember studying the "availiability of energy" and I will try to find it in some reference. If you have a reference, please let me know. PAR 12:45, 5 December 2005 (UTC)
- You're not going to be able to isothermally compress the system at a constant temperature below T_R without using a heat-pump to keep it refrigerated; and further external energy will have to be dissipated as heat to work the heat pump.
- The point is, that if the entropy of the system falls by ΔS, then the reservoir is going to have to gain heat T_R ΔS from somewhere. You could claim that this was coming from somewhere else, and not out of the work being done using the energy lost from the system -- but I think insisting on such hair-splitting would be disingenuous: when we are talking about the energy available to do work, it is surely the net amount of energy/work available we understand. -- Jheald 21:18, 6 December 2005 (UTC).
- Well, I don't want to go so far as to say you are right, but I doubt my own argument enough now that I put back your explanation. There's something that still bothers me about this that I can't explain, but until I can, I shouldn't be ejecting your statement. If what is bothering me turns into a valid argument, you will be hearing from me :) PAR 09:35, 7 December 2005 (UTC)
Entropy and free energy
I recently edited this page to include the ideas of free energy. However, the changes were reverted. Is it not true that entropy is related to "trapped energy", or energy that is not free energy? The product given at the top of the page, S·T_R, is supposed to be the energy "unavailable to do work" -- this *is* the energy that isn't free energy. Entropy is a very misunderstood topic and gave me hell in chemistry - disorder is an awful way to describe entropy because it leads to so much confusion. In any case, I believe it is important to drive home that entropy is *not* disorganization - but is directly related to the energy in a system, namely the trapped energy. Please discuss the addition of the concept of free energy to this article. Fresheneesz 20:16, 20 November 2005 (UTC)
- Well, I didn't do the reverting, but I think maybe a short section on the subject would be better. Free energy when precisely defined is not an intuitively obvious concept, so references to it seem cryptic to the less informed. PAR 22:11, 20 November 2005 (UTC)
Ideally the free energy article would allow a reader to easily understand the concept. Entropy itself is very cryptic to even very informed persons. My goal is to make entropy a subject that is easily understood on Wikipedia - and I think tying the concept to free energy helps do that. Fresheneesz 22:19, 20 November 2005 (UTC)
- Should it go in the free energy article, or the individual Helmholtz free energy and Gibbs free energy articles? PAR 23:11, 20 November 2005 (UTC)
- I think it should go in this article - after all, it would be about entropy. Saying something like "entropy is closely related to the energy in a system that is not free energy" belongs in the article about entropy, not free energy. 68.6.112.70 23:25, 20 November 2005 (UTC)