User:David Shear/Entropy of Mixing
The entropy of mixing is the uncertainty about the spatial locations of the various kinds of molecules in a mixture. In a pure condensed phase, there is no spatial uncertainty: everywhere we look, we find the same kind of molecule. A single-component gas is mostly empty space, but when we do encounter a molecule, there is no doubt about what kind it is. When two or more substances are interdispersed, we may know the various proportions, but we have no way of knowing which kind of molecule is where. The notion of "finding" a molecule in a given location is a thought experiment, since we cannot actually examine spatial locations the size of molecules. Moreover, individual molecules of a given kind are all identical, so we never ask which one is where. "Interchanging" two identical objects is not a real process; it does not lead to a physically distinct condition.
Some liquids will mix while others are immiscible. Gases always intermix, since free molecules will always move into empty space. Solid mixtures can be prepared by cooling liquid mixtures. Solutions are mixtures in which one component, the solvent, predominates. Many mixtures combine materials initially in different states of matter, e.g., liquids in which solids or gases are dissolved. We assume that any mixing process has reached thermodynamic equilibrium, so that the final material is homogeneous; how a mixture came to be is irrelevant. In addition to actual mixing, components can be formed from others by chemical reactions.
Derivations of the entropy of mixing usually begin with free energy functions and chemical potentials, resulting in unnecessarily complex arguments. The theory is unified so long as large disparities in molecular sizes do not influence the results. (See the Flory-Huggins solution theory for solutions of long-chain polymers.) This unity results in mole fractions appearing in the chemical potentials of both gases and solutes in solutions.
Strategy
Imagine space to be subdivided into a lattice whose cells are the size of the molecules. It does not have to be square; any lattice will do, including close packing. Molecules obey a classical exclusion principle: only one object can be in a given place at any one time. This is due to the interatomic forces which generate steric effects, but the use of geometrized constraints is common in theoretical physics and physical chemistry.
Picture an enormous checkerboard. Relax the condition that the black and white squares are equal in number. Generalize to three dimensions. For a gas, imagine that a great excess of white squares represents empty space. For a mixture, give each component its own color. Now imagine all possible spatial rearrangements. This is our model of the different configurations for the molecules in a system.
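As a minimal illustration of this counting model (a toy system chosen for this sketch, not taken from the article), one can enumerate every arrangement of a handful of "colored squares" by brute force and confirm that the number of distinct patterns equals the multinomial coefficient used in the derivations below:

    # Python sketch: count distinct arrangements of 2 "A" molecules,
    # 2 "B" molecules, and 2 empty cells "." on a 6-cell lattice.
    from itertools import permutations
    from math import factorial

    cells = ["A", "A", "B", "B", ".", "."]
    distinct = set(permutations(cells))                # distinct spatial patterns
    multinomial = factorial(6) // (factorial(2) ** 3)  # 6!/(2! 2! 2!)
    print(len(distinct), multinomial)                  # both print 90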
A pure condensed phase has little spatial uncertainty, with one of its molecules (almost) everywhere we look. A mixture is still dense with molecules, but now there is uncertainty about what kind of molecule is in any given location. A pure crystal has its own intrinsic lattice; the same kind of molecule occupies every site. Mixed crystals can be formed from molecules with isotopic substitutions, or from closely related chemical species. For less ordered condensed phases, we will use an artificial geometrical lattice to assign locations to molecular centers of mass. Greater molecular disorder in liquids and amorphous solids as compared to crystals shows up as free volume; a liquid is (usually) less dense than its own crystalline phase. For each system, assume we can choose an optimum lattice with cells small enough so that most hold only one molecule, but large enough so that most cells are filled.
A gas has a huge amount of spatial uncertainty because most of the volume is empty space, which plays the role of “solvent”. For a single-component gas, the only question is: does a lattice site contain the center of mass of a gas molecule, or is it empty? The entropy increase accompanying the free expansion of a gas into a vacuum may be regarded as the entropy of mixing of the gas with empty space. In a mixture of gases, there is a second question, which arises only for occupied sites: which kind of molecule is present?
Boltzmann's method
The fundamental assumption of statistical mechanics is that each possible way of achieving a macroscopic state is equally likely. Boltzmann's equation for the entropy is

    S = k ln W

in which W is the number of (unobservable) microscopic "ways" the molecules can be assigned to different conditions or states consistent with the overall macroscopic thermodynamic condition of a system and k is Boltzmann's constant. We will apply this to the number of ways a mixture of different kinds of molecules can be arranged in space.
The justification for splitting position-momentum phase space into a position part, which we will use, and a momentum (energy) part, which we will ignore, is that for all molecular materials at room temperature, the thermal de Broglie wavelength is much less than intermolecular distances; in fact, it is less than actual molecular diameters. In this classical limit Heisenberg's uncertainty principle is irrelevant. We can talk about a classical gas expanding from one corner of an enclosure to fill an entire enclosure, a process which has no sensible meaning using the Schrödinger time-independent wave equation.
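A quick numerical check of this classical-limit criterion (a sketch assuming nitrogen at 298 K and 1 atm; the constants are standard values, not from the article):

    # Python sketch: thermal de Broglie wavelength of N2 vs. molecular scale.
    import math

    h  = 6.62607015e-34      # Planck constant, J s
    kB = 1.380649e-23        # Boltzmann constant, J/K
    T  = 298.0               # temperature, K
    m  = 28.0 * 1.66054e-27  # mass of an N2 molecule, kg

    wavelength = h / math.sqrt(2 * math.pi * m * kB * T)

    n_density = 101325.0 / (kB * T)       # molecules per m^3, ideal gas at 1 atm
    spacing = n_density ** (-1.0 / 3.0)   # mean intermolecular distance, m

    print(f"lambda  = {wavelength * 1e9:.3f} nm")  # ~0.019 nm
    print(f"spacing = {spacing * 1e9:.2f} nm")     # ~3.4 nm; N2 diameter ~0.36 nm

The wavelength comes out roughly twenty times smaller than the molecular diameter, and nearly two hundred times smaller than the mean intermolecular distance, confirming that the classical treatment is appropriate.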
Shannon's method
A shorter and more logically transparent method, not requiring Stirling's approximation, is to use Shannon's definition of entropy to calculate the compositional uncertainty

    S/k = - Σi pi ln pi

We employ the same (real or conceptual) lattice, where

    pi = Ni / N

is the probability that a molecule of i is in any given lattice site, equal to the number of molecules of i, Ni, divided by the number of lattice sites, N. The summation is over all the chemical species present, so this is the uncertainty about which kind of molecule (if any) is in any one site. It must be multiplied by the total number of sites to get the spatial uncertainty for the whole system.
Condensed phases
Let us proceed first by using the traditional Boltzmann formula. The simplest case is a mixture with only two components, 1 and 2. We are after the number of possible patterns or configurations achievable with N1 molecules of component 1 and N2 molecules of component 2 arranged on a lattice with N* total sites. This is given by the formula for the permutations of N* things subject to the condition that N1 of them are identical, and likewise for N2 and N0:

    W = N*! / (N0! N1! N2!)

where N0 is the number of empty lattice sites: zero for a crystal (0! = 1) and a small fraction of N* for other condensed phases, but by far the greatest part of N* in a gas. N0 can also be taken as the number of solvent molecules in a solution, demonstrating the analogy. The total number of (other) molecules is N = N1 + N2. The logarithm of the result with N2 = 0 gives both the spatial uncertainty in a gas and the entropy of mixing of a single component with a solvent (N0).

For a condensed phase, the number of sites is equal to the total number of molecules, N* = N = N1 + N2, and the number of distinct configurations W reduces to the permutations of N things subject to the condition that N1 of them are identical, and likewise for N2:

    W = N! / (N1! N2!)
Using this algebraic form in Boltzmann's equation and applying Stirling's approximation for the logarithms of factorials, the configurational uncertainty, or entropy of mixing, turns out to be

    ΔS_mix = -k [ N1 ln(N1/N) + N2 ln(N2/N) ]

which has been written using the conventional ΔS notation (Δ denotes a change), suggesting that the mixture has been formed by a mixing process from two separate pure phases, each of which originally had no spatial uncertainty. This expression can be generalized to a mixture of r components, with i = 0, 1, 2, 3, ... r:

    ΔS_mix = -k N* Σi xi ln xi,   with xi = Ni / N*

(For a condensed phase, N0 = 0 and the i = 0 term vanishes.)
We have introduced the mole fractions, which are also the probabilities of finding any particular component in a given lattice site.
For the two-component case,

    ΔS_mix = -nR (x1 ln x1 + x2 ln x2),   x1 = n1/n,  x2 = n2/n

where R is the gas constant, equal to k times Avogadro's number, n1 and n2 are the numbers of moles of the components, and n = n1 + n2 is the total number of moles. Since the mole fractions are necessarily less than one, the values of the logarithms are negative. The minus sign reverses this, giving a positive entropy of mixing, as expected.
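As a worked numerical example of the molar formula above (a sketch; the mole numbers are arbitrary inputs chosen here):

    # Python sketch: ideal entropy of mixing for a two-component mixture.
    import math

    R = 8.314462618  # gas constant, J/(mol K)

    def entropy_of_mixing(n1: float, n2: float) -> float:
        """Return the entropy of mixing, in J/K, for n1 + n2 moles."""
        n = n1 + n2
        x1, x2 = n1 / n, n2 / n
        return -n * R * (x1 * math.log(x1) + x2 * math.log(x2))

    # Equimolar case: Delta-S = n R ln 2, about 5.76 J/K per mole of mixture.
    print(entropy_of_mixing(0.5, 0.5))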
Shannon's formula yields the desired result directly. The summation is over the various chemical species, so this is the uncertainty about which kind of molecule is in any one site. It must be multiplied by the number of sites N to get the uncertainty for the whole system. Doing this, and using the fact that pi = Ni / N, we obtain

    ΔS_mix = -kN Σi pi ln pi = -k [ N1 ln(N1/N) + N2 ln(N2/N) ]
which is the same as the result obtained using Boltzmann's formula. The two methods are essentially equivalent. (But see the Discussion.)
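The equivalence, and the finite-size caveat taken up in the Discussion, can both be checked numerically (a sketch; the particle numbers are arbitrary, and lgamma(n + 1) = ln n! is used to avoid enormous integers):

    # Python sketch: exact Boltzmann count vs. the Shannon/Stirling form.
    import math

    def ln_W_exact(N1: int, N2: int) -> float:
        """ln(N! / (N1! N2!)) computed exactly via log-gamma."""
        N = N1 + N2
        return math.lgamma(N + 1) - math.lgamma(N1 + 1) - math.lgamma(N2 + 1)

    def ln_W_shannon(N1: int, N2: int) -> float:
        """-N * sum(x ln x), the Stirling-limit (Shannon) result."""
        N = N1 + N2
        return -sum(Ni * math.log(Ni / N) for Ni in (N1, N2))

    for N1, N2 in [(5, 5), (500, 500), (500_000, 500_000)]:
        exact, approx = ln_W_exact(N1, N2), ln_W_shannon(N1, N2)
        print(N1 + N2, exact, approx, approx - exact)

The absolute discrepancy grows only logarithmically with system size, so the relative error vanishes in the thermodynamic limit; for ten particles, however, it is already about 25 percent of the exact value.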
Solutions
If the solute is a crystalline solid, the argument is much the same. A crystal has no spatial uncertainty at all, except for crystallographic defects, and a (perfect) crystal allows us to localize the molecules using the crystal symmetry group. The fact that volumes do not add when dissolving a solid in a liquid is not important for condensed phases. If the solute is not crystalline, we can still use a spatial lattice, as good an approximation for an amorphous solid as it is for a liquid.
The Flory-Huggins solution theory provides the entropy of mixing for polymer solutions, in which the macromolecules are huge compared to the solvent molecules. In this case, the assumption is made that each monomer subunit in the polymer chain occupies a lattice site.
Note that solids in contact with each other also slowly interdiffuse, and solid mixtures of two or more components may be made at will (alloys, semiconductors, etc.). Again, the same equations for the entropy of mixing apply, but only for homogeneous, uniform phases.
Gases
In order to get the total entropy of a gas, we must also calculate the contingent uncertainty about the momentum of a molecule for each lattice site that is found to contain one. We obtain a Boltzmann distribution over energies and a partition function which depend on (and define) the temperature, and add this to the spatial uncertainty. But in regard to mixing, we are concerned only with spatial entropy.
If we have a pure gas consisting of N1 molecules, we want to calculate the number of ways, or occupancy patterns, W, of arranging N1 occupied sites and N0 empty sites on a lattice with N total sites:

    W = N! / (N1! N0!)

and

    N = N1 + N0

But we have just performed this calculation above, although with N0 playing the role of N2. Clearly, the spatial uncertainty in gas entropy is just the entropy of mixing of gas molecules and empty space. For a pure gas, considering just the spatial uncertainty part of the entropy,

    S/k = ln W ≈ -N1 ln(N1/N) - N0 ln(N0/N) ≈ -N1 ln(N1/N) = -N1 ln(ρ1 v)
The simplification is possible because N0 / N is just slightly less than one and its log is negligible; most of the space in a gas is empty lattice sites. Note that ρ1 = (N1 / Nv) = (N1 / V) is the molecular concentration, or number density, of the gas molecules, where v is the volume of a single lattice site and V is the total volume of the system. The reciprocal of this quantity is the volume per molecule, V / N1. So long as this is large with respect to Λ3, the cube of the thermal de Broglie wavelength, we can be sure that the "wave packets" for the molecules hardly ever touch, and the classical mechanical treatment is the appropriate one. For all real gases at room temperature, this condition is more than satisfied.
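For instance, the formula above reproduces the familiar free-expansion result, ΔS = nR ln(V2/V1), independently of the arbitrary cell volume v (a sketch; the cell volume used here is only an assumed, molecule-sized value):

    # Python sketch: spatial entropy S = -k N1 ln(N1 v / V) and free expansion.
    import math

    kB = 1.380649e-23   # Boltzmann constant, J/K
    NA = 6.02214076e23  # Avogadro's number, 1/mol

    def spatial_entropy(N1: float, V: float, v: float) -> float:
        """S = -k N1 ln(rho1 * v), with rho1 = N1 / V."""
        return -kB * N1 * math.log(N1 * v / V)

    N1 = NA          # one mole of gas molecules
    v = 1.0e-29      # assumed lattice cell volume, m^3 (roughly molecular size)
    S1 = spatial_entropy(N1, 0.0224, v)    # 22.4 L
    S2 = spatial_entropy(N1, 0.0448, v)    # doubled volume
    print(S2 - S1, NA * kB * math.log(2))  # both ~5.76 J/K; v has cancelled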
In the ideal gas approximation, which is quite good for dilute gases at normal temperatures, volumes are additive for two samples of different gases combined at constant T and P. In any case, let N2 be the number of molecules of a second type of gas in a mixture. The spatial part of the entropy of the mixture is k times the logarithm of

    W = N*! / (N0! N1! N2!)

in which N* is the total number of lattice sites, N1 is the number of molecules of component 1, N2 is the number of molecules of component 2, and N0 is the number of empty lattice sites, by far the greatest part of N* for a gas. The total number of molecules is N = N1 + N2.
We can regard the mixing of two kinds of gas (at constant T and P) as simply conjoining the two containers. The two lattices which allow us to conceptually localize molecular centers of mass also join. The total number of empty cells is the sum of the numbers of empty cells in the two components prior to mixing. Consequently, that part of the spatial uncertainty concerning whether any molecule is present in a lattice cell is the sum of the initial values, and does not increase upon mixing.
Almost everywhere we look, we find empty lattice sites. But for those few sites which are occupied, there is a contingent uncertainty about which kind of molecule it is. Using conditional probabilities, it turns out that the analytical problem for the small subset of occupied cells is exactly the same as for mixed liquids, and the increase in the entropy, or spatial uncertainty, has exactly the same form as obtained previously. Obviously the subset of occupied cells is not the same at different times.
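This can be verified directly with exact counts (a sketch; the particle numbers and the occupancy are arbitrary choices, and lgamma is again used for the factorials):

    # Python sketch: conjoin two gas lattices at equal occupancy (constant T, P)
    # and compare the exact entropy increase with -k (N1 ln x1 + N2 ln x2).
    import math

    def ln_patterns(sites: int, *counts: int) -> float:
        """ln of sites! / (n_empty! * n1! * n2! ...)."""
        empty = sites - sum(counts)
        result = math.lgamma(sites + 1) - math.lgamma(empty + 1)
        for c in counts:
            result -= math.lgamma(c + 1)
        return result

    N1, N2 = 400_000, 600_000
    cells_per_molecule = 1000              # dilute: 1 occupied cell in 1000
    sa, sb = N1 * cells_per_molecule, N2 * cells_per_molecule

    before = ln_patterns(sa, N1) + ln_patterns(sb, N2)
    after = ln_patterns(sa + sb, N1, N2)

    x1, x2 = N1 / (N1 + N2), N2 / (N1 + N2)
    predicted = -(N1 * math.log(x1) + N2 * math.log(x2))
    print(after - before, predicted)       # agree to within O(ln N) terms

    # Same kind of gas: one species, so no entropy of mixing (Gibbs "paradox").
    same = ln_patterns(sa + sb, N1 + N2) - before
    print(same)                            # small O(ln N), vs. ~6.7e5 above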
See also: Gibbs paradox, in which it would seem that mixing two samples of the same gas would produce entropy. The derivation given here avoids this "paradox", since if the molecules are all of the same kind, there is no entropy increase. Underlying this success is the fact that we are drawing a distinction between "identical" and "indistinguishable". Identical objects are distinguishable if they are in different places, even though they cannot be intrinsically labeled. Identical objects can be treated as distinguishable if their wavefunctions do not sensibly overlap.
Discussion
The preceding analysis is only an approximation, except for dilute gases. It is not too bad for mixtures of denser gases, or for liquids or amorphous solids with molecules of about the same size, and likewise for crystalline mixtures. We have not considered intermolecular forces (energies). Mixing substances whose molecules interact with each other differently than they do in their pure phases results in a (positive or negative) heat (or enthalpy) of mixing, in addition to considerations of entropy. We have ignored any correlations in the dispositions of neighboring molecules, including angular orientations due to molecular shapes, or due to any other geometrical or energetic reason, such as clouds of counter-ions surrounding charged colloidal particles. The fundamental assumption is that all occupancy patterns, or spatial "microstates", are counted as equally likely. But biasing effects of near-neighbor interactions could perhaps be incorporated into the theory.
It is desirable to maintain the form of the equations derived above, even if correction factors are required. Whenever possible, deviations from ideality, in both entropy and energy, are handled by multiplying mole fractions (or concentrations) by experimentally or theoretically determined activity coefficients. Mixing substances with gross disparities in molecular size requires a better mathematical model; for long-chain polymers, see the Flory-Huggins solution theory.
There is a tacit mathematical assumption involved in using the Shannon entropy, one which might have escaped notice and which makes it differ in an interesting way from the Boltzmann formula. If we "find" a molecule of type 2 in the first location we examine, there are only N2 − 1 molecules of 2 left to be found in the N − 1 remaining lattice sites. That is, one site and one molecule of 2 have each been "used up", and we should proceed only after taking that into account. This is possible, but algebraically messy.

However, it is not a problem in the thermodynamic limit of large systems, where we can regard our system as a smaller subsystem defined by geometrical "walls" through which molecules can pass. In this case, N2 − 1 is a time-average value and not a rigid constraint. This is the idea behind Gibbs' grand canonical ensemble. But for systems of finite size, it is the original Boltzmann formulation for entropy in terms of factorials which is really correct, since it uses the actual particle numbers; this makes the presentation of the "long way" instructive. The use of Stirling's approximation also eliminates any mathematical distinction between the two ensembles, producing the same final results and making epistemological arguments about their inner meanings moot. For systems with a small number of particles, if it is possible to use the Boltzmann formula without Stirling's approximation, that will give the more accurate result. However, the idea that the canonical ensemble represents an external heat bath which maintains constant T by maintaining an average internal energy still stands.
Notes
- ↑ In addition to spatial uncertainty, all substances also have uncertainty regarding the energies of their molecules (or degrees of freedom). This part of the entropy can be determined by integrating the specific heat over the absolute temperature from T = 0 up to the ambient temperature. (See Measuring entropy.)
- ↑ Amorphous materials such as glasses do not go through sharp transitions to a crystalline state when they solidify and may be considered a variation on liquids. Thermoplastic materials generally have crystalline regions below their melting temperature and are more complex. Solutions of long-chain polymers have their own treatment; see the Flory-Huggins theory.
- ↑ There is often a volume change on mixing: the initial volumes do not quite sum to the volume of the mixture, due to interstitial packing and specific molecular interactions.
- ↑ Claude Shannon introduced this expression for use in information theory, but similar formulas can be found as far back as the work of Ludwig Boltzmann and J. Willard Gibbs. Note that "uncertainty" in the entropic sense is completely unrelated to the Heisenberg uncertainty principle of quantum mechanics.