Fundamental thermodynamic relation
From Wikipedia, the free encyclopedia
In thermodynamics, the fundamental thermodynamic relation combines the first law of thermodynamics and the second law of thermodynamics into a single concise mathematical statement:

dE = T dS - P dV

Here, E is internal energy, T is temperature, S is entropy, P is pressure, and V is volume.
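As a concrete numerical check (not part of the article's derivation), the relation can be verified for a monatomic ideal gas, whose entropy as a function of E and V is given by the Sackur–Tetrode equation; the gas, particle number and state below are illustrative choices:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J s
m = 6.64e-27        # mass of a helium atom, kg (illustrative choice)
N = 6.022e23        # number of particles

def S(E, V):
    """Sackur-Tetrode entropy of a monatomic ideal gas."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * E / (3 * N * h**2))**1.5) + 2.5)

# Pick a state: T = 300 K, V = 0.05 m^3
T = 300.0
V = 0.05
E = 1.5 * N * k * T   # E = (3/2) N k T for a monatomic ideal gas
P = N * k * T / V     # ideal gas law

# Make small changes dE and dV, and compare dE with T dS - P dV,
# as the fundamental relation demands.
dE = E * 1e-6
dV = V * 1e-6
dS = S(E + dE, V + dV) - S(E, V)
lhs = dE
rhs = T * dS - P * dV
print(lhs, rhs)  # the two should agree to high accuracy
```

The check works because (∂S/∂E)_V = 1/T and (∂S/∂V)_E = P/T for this entropy function, which is exactly the content of dE = T dS - P dV.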
Thermodynamic derivation
Starting from the first law:

dE = δQ - δW

From the second law we have for a reversible process:

dS = δQ/T

Hence:

δQ = T dS

By substituting this into the first law, we have:

dE = T dS - δW

Letting δW be reversible pressure-volume work, δW = P dV, we have:

dE = T dS - P dV
This has been derived in the case of reversible changes. However, since E, S and V are thermodynamic functions of state, the above relation holds also for non-reversible changes. If the system has more external variables than just the volume that can change, and if the numbers of particles in the system can also change, the fundamental thermodynamic relation generalizes to:

dE = T dS - Σ_i X_i dx_i + Σ_j μ_j dN_j

Here the X_i are the generalized forces corresponding to the external variables x_i, and the μ_j are the chemical potentials corresponding to particles of type j.
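The chemical potential term can be illustrated with the Sackur–Tetrode entropy of a monatomic ideal gas: treating the particle number N as continuous, μ = -T(∂S/∂N) at constant E and V should reproduce the familiar ideal-gas result μ = kT ln(nλ³), with n the number density and λ the thermal de Broglie wavelength. The gas and state below are illustrative choices:

```python
import math

k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J s
m = 6.64e-27        # helium atom mass, kg (illustrative choice)

def S(E, V, N):
    """Sackur-Tetrode entropy, with N treated as a continuous variable."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * E / (3 * N * h**2))**1.5) + 2.5)

T = 300.0
V = 0.05
N = 6.022e23
E = 1.5 * N * k * T

# mu = -T (dS/dN) at constant E and V, by central difference
dN = N * 1e-6
mu_numeric = -T * (S(E, V, N + dN) - S(E, V, N - dN)) / (2 * dN)

# Ideal-gas chemical potential: mu = k T ln(n lambda^3)
lam = h / math.sqrt(2 * math.pi * m * k * T)   # thermal de Broglie wavelength
n = N / V                                      # number density
mu_analytic = k * T * math.log(n * lam**3)

print(mu_numeric, mu_analytic)  # the two should agree closely
```

For a dilute classical gas nλ³ is much smaller than 1, so the chemical potential comes out negative, as expected.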
Derivation using the microcanonical ensemble
The above derivation can be criticized on the grounds that it merely defines a partitioning of the change in internal energy into heat and work in terms of the entropy. As long as the entropy is not defined in terms of the fundamental properties of the system, the fundamental thermodynamic relation is vacuous.
The entropy of an isolated system containing an amount of energy E is defined as:

S = k log Ω(E)

where Ω(E) is the number of quantum states in a small interval between E and E + δE. Here δE is a macroscopically small energy interval that is kept fixed. Strictly speaking this means that the entropy depends on the choice of δE. However, in the thermodynamic limit (i.e. in the limit of infinitely large system size), the specific entropy (entropy per unit volume or per unit mass) does not depend on δE. The entropy is thus a measure of the uncertainty about exactly which quantum state the system is in, given that we know its energy to be in some interval of size δE.
The fundamental assumption of statistical mechanics is that all the Ω(E) states are equally likely. This allows us to extract all the thermodynamical quantities of interest. The temperature is defined as:

1/(kT) ≡ β ≡ d log Ω(E)/dE
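A toy illustration of this definition (not from the article): for N independent two-level units with n excitations of energy ε each, Ω is a binomial coefficient, and the definition above reproduces the known result βε = ln((N - n)/n). Setting ε = 1, the derivative of log Ω becomes a discrete difference between adjacent energy levels:

```python
import math

# N two-level units; a state with n excited units has energy E = n (eps = 1).
# Omega(E) = C(N, n): number of microstates with that energy.
N = 1_000_000
n = 200_000

def log_omega(n):
    """ln C(N, n), computed stably via lgamma for large N."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

# 1/(kT) = d(log Omega)/dE; with eps = 1 this is the discrete
# difference of log Omega between adjacent energy levels.
beta_numeric = log_omega(n + 1) - log_omega(n)

# Analytic result for the two-level system: beta * eps = ln((N - n)/n)
beta_analytic = math.log((N - n) / n)

print(beta_numeric, beta_analytic)  # the two should agree closely
```

Since n < N/2 here, β comes out positive, i.e. the temperature is positive; populating more than half the units would give the negative temperatures characteristic of bounded energy spectra.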
This definition can be justified within the microcanonical ensemble. Suppose that the system has some external parameter, x, that can be changed. In general, the energy eigenstates of the system will depend on x. According to the adiabatic theorem of quantum mechanics, in the limit of an infinitely slow change of x, the system will stay in the same energy eigenstate and thus change its energy according to the change in energy of the energy eigenstate it is in.
The generalized force, X, corresponding to the external variable x is defined such that X dx is the work performed by the system if x is increased by an amount dx. E.g., if x is the volume, then X is the pressure. The generalized force for a system known to be in energy eigenstate E_r is given by:

X = - dE_r/dx
Since the system can be in any energy eigenstate within an interval of δE, we define the generalized force for the system as the expectation value of the above expression:

X = - ⟨dE_r/dx⟩
To evaluate the average, we partition the energy eigenstates by counting how many of them have a value of dE_r/dx within a range between Y and Y + δY. Calling this number Ω_Y(E), we have:

Ω(E) = Σ_Y Ω_Y(E)

The average defining the generalized force can now be written:

X = - (1/Ω(E)) Σ_Y Y Ω_Y(E)
We can relate this to the derivative of the entropy with respect to x at constant energy E as follows. Suppose we change x to x + dx. Then Ω(E) will change because the energy eigenstates depend on x, causing energy eigenstates to move into or out of the range between E and E + δE. Let's focus again on the energy eigenstates for which dE_r/dx lies within the range between Y and Y + δY. Since these energy eigenstates increase in energy by Y dx, all such energy eigenstates that are in the interval ranging from E - Y dx to E move from below E to above E. There are

N_Y(E) = (Ω_Y(E)/δE) Y dx

such energy eigenstates. If Y dx ≤ δE, all these energy eigenstates will move into the range between E and E + δE and contribute to an increase in Ω. The number of energy eigenstates that move from below E + δE to above E + δE is, of course, given by N_Y(E + δE). The difference

N_Y(E) - N_Y(E + δE)
is thus the net contribution to the increase in Ω. Note that if Y dx is larger than δE there will be energy eigenstates that move all the way from below E to above E + δE. They are counted in both N_Y(E) and N_Y(E + δE), therefore the above expression is also valid in that case.
Expressing the above expression as a derivative with respect to E and summing over Y yields the expression:

(∂Ω/∂x)_E = - Σ_Y Y (∂Ω_Y/∂E) = ∂(ΩX)/∂E
The logarithmic derivative of Ω with respect to x is thus given by:

(∂ log Ω/∂x)_E = β X + ∂X/∂E
The first term is intensive, i.e. it does not scale with system size. In contrast, the last term scales as the inverse system size and thus vanishes in the thermodynamic limit. We have thus found that:

(∂ log Ω/∂x)_E = β X
Combining this with

(∂ log Ω/∂E)_x = β

gives:

dS = k d(log Ω) = k β (dE + X dx) = (dE + X dx)/T

which we can write as:

dE = T dS - X dx
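The counting argument above can be checked numerically on a toy system (not from the article): a single particle in a 3-D box, with the box side L as the external parameter, in units where E_r = (nx² + ny² + nz²)/L². For this model X = ⟨2E_r/L⟩ ≈ 2E/L and β = 1/(2E), so βX = 1/L while ∂X/∂E = 2/L. A single particle is far from the thermodynamic limit, so the state count should match the full expression βX + ∂X/∂E = 3/L rather than βX alone; all numbers below are illustrative:

```python
import math

def omega(E, L, dE):
    """Count box eigenstates (nx, ny, nz >= 1) with E <= (nx^2+ny^2+nz^2)/L^2 <= E+dE."""
    lo = E * L * L           # window expressed in n^2 = nx^2 + ny^2 + nz^2
    hi = (E + dE) * L * L
    nmax = math.isqrt(int(hi))
    count = 0
    for nx in range(1, nmax + 1):
        for ny in range(1, nmax + 1):
            r_hi = hi - nx * nx - ny * ny
            if r_hi < 1:
                break                      # no room left for nz >= 1
            r_lo = lo - nx * nx - ny * ny
            nz_max = math.isqrt(int(r_hi))
            nz_min = 1 if r_lo <= 1 else math.ceil(math.sqrt(r_lo))
            if nz_max >= nz_min:
                count += nz_max - nz_min + 1
    return count

E0, dE, L = 4.0e5, 2000.0, 1.0

# beta = d(log Omega)/dE by central difference; analytically 1/(2E) for this model.
h = 1.0e5
beta = (math.log(omega(E0 + h, L, dE)) - math.log(omega(E0 - h, L, dE))) / (2 * h)

# d(log Omega)/dL at constant E, by central difference.
dL = 0.05
dlog_dL = (math.log(omega(E0, L + dL, dE)) - math.log(omega(E0, L - dL, dE))) / (2 * dL)

# Compare with beta*X + dX/dE, where X = 2E/L and dX/dE = 2/L.
X = 2 * E0 / L
full = beta * X + 2 / L   # should be ~ 3/L
print(beta, dlog_dL, full)
```

The numerical slope dlog_dL lands near 3/L, matching βX + ∂X/∂E but not βX = 1/L alone, which makes concrete why the ∂X/∂E term can only be dropped for macroscopic systems.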