Onsager reciprocal relations

From Wikipedia, the free encyclopedia

In thermodynamics, the Onsager reciprocal relations express the equality of certain ratios between flows and forces in thermodynamic systems out of equilibrium, but where a notion of local equilibrium exists.

"Reciprocal relations" occur between different pairs of forces and flows in a variety of physical systems. For example, consider fluid systems described in terms of temperature, matter density, and pressure. In this class of systems, it is known that temperature differences lead to heat flows from the warmer to the colder parts of the system; similarly, pressure differences will lead to matter flow from high-pressure to low-pressure regions. What is remarkable is the observation that, when both pressure and temperature vary, temperature differences at constant pressure can cause matter flow (as in convection) and pressure differences at constant temperature can cause heat flow. Perhaps surprisingly, the heat flow per unit of pressure difference and the density (matter) flow per unit of temperature difference are equal. This equality was shown to be necessary by Lars Onsager using statistical mechanics as a consequence of the time reversibility of microscopic dynamics (microscopic reversibility). The theory developed by Onsager is much more general than this example and capable of treating more than two thermodynamic forces at once, with the limitation that "the principle of dynamical reversibility does not apply when (external) magnetic fields or Coriolis forces are present", in which case "the reciprocal relations break down".[1]

Though the fluid system is perhaps described most intuitively, the high precision of electrical measurements makes experimental realisations of Onsager's reciprocity easier in systems involving electrical phenomena. In fact, Onsager's 1931 paper[1] refers to thermoelectricity and transport phenomena in electrolytes as well known from the 19th century, including "quasi-thermodynamic" theories by Thomson and Helmholtz respectively. Onsager's reciprocity in the thermoelectric effect manifests itself in the equality of the Peltier (heat flow caused by a voltage difference) and Seebeck (electrical current caused by a temperature difference) coefficients of a thermoelectric material. Similarly, the so-called "direct piezoelectric" (electrical current produced by mechanical stress) and "reverse piezoelectric" (deformation produced by a voltage difference) coefficients are equal. For many kinetic systems, like the Boltzmann equation or chemical kinetics, the Onsager relations are closely connected to the principle of detailed balance[1] and follow from it in the linear approximation near equilibrium.

Experimental verifications of the Onsager reciprocal relations were collected and analyzed by D.G. Miller[2] for many classes of irreversible processes, namely for thermoelectricity, electrokinetics, transference in electrolytic solutions, diffusion, conduction of heat and electricity in anisotropic solids, thermomagnetism and galvanomagnetism. In this classical review, chemical reactions are considered cases with meager and inconclusive evidence. Further theoretical analysis and experiments support the reciprocal relations for chemical kinetics with transport.[3]

For his discovery of these reciprocal relations, Lars Onsager was awarded the 1968 Nobel Prize in Chemistry. The presentation speech referred to the three laws of thermodynamics and then added "It can be said that Onsager's reciprocal relations represent a further law making a thermodynamic study of irreversible processes possible."[4] Some authors have even described Onsager's relations as the "Fourth law of thermodynamics".[5]

Example: Fluid system

The fundamental equation

The basic thermodynamic potential is internal energy. In a simple fluid system, neglecting the effects of viscosity, the fundamental thermodynamic equation is written:

dU = T\,dS - P\,dV + \mu\,dM

where U is the internal energy, T is temperature, S is entropy, P is the hydrostatic pressure, \mu is the chemical potential, and M is the mass. In terms of the internal energy density u, entropy density s, and mass density \rho, the fundamental equation is written:

du=T\,ds+\mu \,d\rho

For non-fluid or more complex systems there will be a different collection of variables describing the work term, but the principle is the same. The above equation may be solved for the entropy density:

ds=(1/T)\,du+(-\mu /T)\,d\rho

The above expression of the first law in terms of entropy change defines the entropic conjugate variables of u and \rho, which are 1/T and -\mu/T. These are intensive quantities analogous to potential energies; their gradients are called thermodynamic forces, as they cause flows of the corresponding extensive variables, as expressed in the following equations.
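
The algebra behind the conjugate variables is short enough to check mechanically. A minimal sympy sketch (symbol names are my own): solve the fundamental equation for ds and read the coefficients of du and d\rho.

```python
import sympy as sp

# Hypothetical symbols for the densities in the fundamental equation.
T, mu = sp.symbols("T mu", positive=True)
du, drho, ds = sp.symbols("du drho ds")

# Solve du = T ds + mu drho for ds.
sol = sp.solve(sp.Eq(du, T * ds + mu * drho), ds)[0]

# The coefficients of du and drho are the entropic conjugates.
f_u = sp.simplify(sp.diff(sol, du))      # expected: 1/T
f_rho = sp.simplify(sp.diff(sol, drho))  # expected: -mu/T
```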

The continuity equations

The extensive quantities U and M are conserved and their flows satisfy continuity equations. The conservation of mass is written:

\frac{\partial \rho}{\partial t} + \nabla \cdot \mathbf{J}_\rho = 0

and, assuming that fluid velocity makes a negligible contribution to the energy flow, the conservation of energy is simply the conservation of the internal energy:

\frac{\partial u}{\partial t} + \nabla \cdot \mathbf{J}_u = 0

where \mathbf{J}_\rho is the mass flux vector and \mathbf{J}_u is the energy flux vector.

The entropy is not conserved and its continuity equation is written:

\frac{\partial s}{\partial t} + \nabla \cdot \mathbf{J}_s = \frac{\partial s_c}{\partial t}

where \frac{\partial s_c}{\partial t} is the rate of increase in entropy density due to the irreversible processes of equilibration occurring in the fluid.

The phenomenological equations

In the absence of matter flows, Fourier's law is usually written:

\mathbf{J}_u = -k\,\nabla T

where k is the thermal conductivity. However, this law is just a linear approximation, and holds only for the case where \nabla T\ll T, with the thermal conductivity possibly being a function of the thermodynamic state variables, but not their gradients or time rate of change. Assuming that this is the case, Fourier's law may just as well be written:

\mathbf{J}_u = k\,T^2\,\nabla(1/T)
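
The equivalence of the two forms follows from \nabla(1/T) = -\nabla T / T^2. A small numerical sketch (the temperature profile and the conductivity value are invented) comparing both expressions on a 1-D grid:

```python
import numpy as np

# Invented smooth 1-D temperature profile on a uniform grid.
x = np.linspace(0.0, 1.0, 2001)
T = 300.0 + 20.0 * np.sin(2 * np.pi * x)
k = 0.6  # thermal conductivity, arbitrary units

J_fourier = -k * np.gradient(T, x)               # J_u = -k grad T
J_entropic = k * T**2 * np.gradient(1.0 / T, x)  # J_u = k T^2 grad(1/T)

# The two fluxes agree up to discretization error of the finite differences.
```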

In the absence of heat flows, Fick's law of diffusion is usually written:

\mathbf{J}_\rho = -D\,\nabla \rho ,

where D is the coefficient of diffusion. Since this is also a linear approximation and since the chemical potential is monotonically increasing with density at a fixed temperature, Fick's law may just as well be written:

\mathbf{J}_\rho = D'\,\nabla(-\mu/T)

where, again, D' is a function of thermodynamic state parameters, but not their gradients or time rate of change. For the general case in which there are both mass and energy fluxes, the phenomenological equations may be written:

\mathbf{J}_u = L_{uu}\,\nabla(1/T) + L_{u\rho}\,\nabla(-\mu/T)
\mathbf{J}_\rho = L_{\rho u}\,\nabla(1/T) + L_{\rho\rho}\,\nabla(-\mu/T)

or, more concisely,

\mathbf{J}_\alpha = \sum_\beta L_{\alpha\beta}\,\nabla f_\beta

where the entropic "thermodynamic forces" conjugate to the "displacements" u and \rho are f_u = (1/T) and f_\rho = (-\mu/T), and L_{\alpha\beta} is the Onsager matrix of phenomenological coefficients.
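
As a concrete illustration, here is a numeric sketch (all coefficient and gradient values are invented) of the matrix form of the phenomenological equations for the two-force fluid example:

```python
import numpy as np

# Invented phenomenological coefficients. Onsager symmetry requires
# L[0, 1] == L[1, 0], and positive entropy production requires L to be
# positive semi-definite.
L = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# Invented local values of the forces grad(1/T) and grad(-mu/T) in 1-D.
grad_f = np.array([0.01, -0.03])

# J_alpha = sum_beta L_{alpha beta} grad f_beta
J_u, J_rho = L @ grad_f
```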

The rate of entropy production

From the fundamental equation, it follows that:

\frac{\partial s}{\partial t} = (1/T)\,\frac{\partial u}{\partial t} + (-\mu/T)\,\frac{\partial \rho}{\partial t}

and

\mathbf{J}_s = (1/T)\,\mathbf{J}_u + (-\mu/T)\,\mathbf{J}_\rho = \sum_\alpha \mathbf{J}_\alpha f_\alpha

Using the continuity equations, the rate of entropy production may now be written:

\frac{\partial s_c}{\partial t} = \mathbf{J}_u \cdot \nabla(1/T) + \mathbf{J}_\rho \cdot \nabla(-\mu/T) = \sum_\alpha \mathbf{J}_\alpha \cdot \nabla f_\alpha

and, incorporating the phenomenological equations:

\frac{\partial s_c}{\partial t} = \sum_\alpha \sum_\beta L_{\alpha\beta}\,(\nabla f_\alpha)\cdot(\nabla f_\beta)

It can be seen that, since the entropy production must be non-negative, the Onsager matrix of phenomenological coefficients L_{\alpha\beta} is a positive semi-definite matrix.
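
This positivity is easy to see numerically: any matrix of the form A Aᵀ is positive semi-definite, and the corresponding quadratic form is non-negative for every choice of forces. A sketch with invented values:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random positive semi-definite "Onsager matrix": L = A A^T.
A = rng.standard_normal((2, 2))
L = A @ A.T

# Entropy production is the quadratic form
#   sigma = sum_{alpha,beta} L_{alpha beta} (grad f_alpha) . (grad f_beta);
# evaluate it for many random force gradients.
sigmas = [g @ L @ g for g in rng.standard_normal((1000, 2))]
```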

The Onsager reciprocal relations

Onsager's contribution was to demonstrate that not only is L_{\alpha\beta} positive semi-definite, it is also, except in certain special cases, symmetric. In other words, the cross-coefficients L_{u\rho} and L_{\rho u} are equal. The fact that they are at least proportional follows from simple dimensional analysis (i.e., both coefficients are measured in the same units of temperature times mass density).

The rate of entropy production for the above simple example uses only two entropic forces, and a 2×2 Onsager phenomenological matrix. The expression for the linear approximation to the fluxes and the rate of entropy production can very often be expressed in an analogous way for many more general and complicated systems.

Abstract formulation

Let x_1, x_2, \ldots, x_n denote fluctuations from equilibrium values in several thermodynamic quantities, and let S(x_1, x_2, \ldots, x_n) be the entropy. Then Boltzmann's entropy formula gives the probability distribution function w = A\exp(S/k), with A a normalization constant, since the probability of a given set of fluctuations \{x_1, x_2, \ldots, x_n\} is proportional to the number of microstates with that fluctuation. Assuming the fluctuations are small, the probability distribution function can be expressed through the second differential of the entropy[6]

w = A\,e^{-\frac{1}{2}\beta_{ik} x_i x_k}\,, \qquad \beta_{ik} = -\frac{1}{k}\,\frac{\partial^2 S}{\partial x_i\,\partial x_k}\,,

where we are using the Einstein summation convention and \beta_{ik} is a positive definite symmetric matrix.

Using the quasi-stationary equilibrium approximation, that is, assuming that the system is only slightly out of equilibrium, we have[6] \dot{x}_i = -\lambda_{ik} x_k

Suppose we define the thermodynamic conjugate quantities X_i = -\frac{1}{k}\,\frac{\partial S}{\partial x_i}, which for small fluctuations can be expressed as linear functions: X_i = \beta_{ik} x_k

Thus, we can write \dot{x}_i = -\gamma_{ik} X_k, where \gamma_{ik} = \lambda_{il}\beta_{lk}^{-1} are called the kinetic coefficients.

The principle of symmetry of the kinetic coefficients, or Onsager's principle, states that \gamma is a symmetric matrix, that is, \gamma_{ik} = \gamma_{ki}.[6]
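
The principle can be illustrated numerically. For the linear relaxation \dot{x}_i = -\lambda_{ik} x_k with equal-time covariance \langle x_i x_k\rangle = (\beta^{-1})_{ik}, the correlation matrix \langle x_i(t) x_k(0)\rangle equals (e^{-\lambda t}\beta^{-1})_{ik}, and symmetry of \gamma = \lambda\beta^{-1} makes it symmetric, which is exactly the time-reversal property \langle x_i(t)x_k(0)\rangle = \langle x_k(t)x_i(0)\rangle. A sketch with invented matrices:

```python
import numpy as np
from scipy.linalg import expm

# Invented example: a symmetric kinetic matrix gamma and a positive
# definite beta; then lambda = gamma @ beta, so gamma = lambda @ inv(beta).
gamma = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
beta = np.array([[2.0, 0.3],
                 [0.3, 1.0]])
lam = gamma @ beta

# Equal-time covariance of the Gaussian fluctuations: C = inv(beta).
C = np.linalg.inv(beta)

# Correlation matrix <x_i(t) x_k(0)> = (exp(-lam t) C)_{ik} at some t > 0;
# symmetry of gamma makes every term lam^n C, and hence corr, symmetric.
corr = expm(-lam * 0.7) @ C
```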

Proof

Define the mean values \xi_i(t) and \Xi_i(t) of the fluctuating quantities x_i and X_i respectively, such that they take given values x_1, x_2, \ldots at t = 0. Note that \dot{\xi}_i(t) = -\gamma_{ik}\Xi_k

Symmetry of fluctuations under time reversal implies that \langle x_i(t)\,x_k(0)\rangle = \langle x_i(-t)\,x_k(0)\rangle = \langle x_i(0)\,x_k(t)\rangle

or, in terms of \xi_i(t), \langle \xi_i(t)\,x_k\rangle = \langle x_i\,\xi_k(t)\rangle

Differentiating with respect to t and substituting the relaxation equation, we get \gamma_{il}\langle \Xi_l(t)\,x_k\rangle = \gamma_{kl}\langle x_i\,\Xi_l(t)\rangle

Putting t = 0 in the above equation gives \gamma_{il}\langle X_l x_k\rangle = \gamma_{kl}\langle X_l x_i\rangle

It can easily be shown from the definitions that \langle X_i x_k\rangle = \delta_{ik}, and hence we have the required result: \gamma_{ik} = \gamma_{ki}.
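
The identity \langle X_i x_k\rangle = \delta_{ik} can also be checked by direct sampling: for Gaussian fluctuations with covariance \beta^{-1} and X = \beta x, the average \langle X x^T\rangle = \beta\beta^{-1} = I. A Monte Carlo sketch with an invented \beta:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented positive definite beta; fluctuations are Gaussian with
# covariance <x x^T> = inv(beta).
beta = np.array([[2.0, 0.5],
                 [0.5, 1.5]])
C = np.linalg.inv(beta)

# Sample fluctuations and form the conjugates X_i = beta_{ik} x_k.
x = rng.multivariate_normal(np.zeros(2), C, size=200_000)
X = x @ beta  # beta is symmetric

# Monte Carlo estimate of <X_i x_k>, which should be close to identity.
avg = X.T @ x / len(x)
```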

References

  1. L. Onsager, "Reciprocal Relations in Irreversible Processes. I.", Phys. Rev. 37, 405–426 (1931).
  2. D.G. Miller, "Thermodynamics of irreversible processes. The experimental verification of the Onsager reciprocal relations", Chem. Rev. 60 (1960), 15–37.
  3. G.S. Yablonsky, A.N. Gorban, D. Constales, V.V. Galvita and G.B. Marin, "Reciprocal relations between kinetic curves", EPL 93 (2011) 20004.
  4. The Nobel Prize in Chemistry 1968. Presentation Speech.
  5. For example Richard P. Wendt, "Simplified Transport Theory for Electrolyte Solutions", Journal of Chemical Education v.51, p.646 (1974).
  6. Landau, L. D.; Lifshitz, E.M. (1975). Statistical Physics, Part 1. Oxford, UK: Butterworth-Heinemann. ISBN 978-81-8147-790-3.
This article is issued from Wikipedia. The text is available under the Creative Commons Attribution/Share Alike; additional terms may apply for the media files.