Gibbs entropy


In thermodynamics, and specifically in statistical mechanics, the Gibbs entropy formula is the standard expression for the statistical-mechanical entropy of a thermodynamic system,

 S = -k_B \sum_i p_i \ln p_i          (1)

where p_i is the probability of the i-th microstate, and the summation is taken over the possible states of the system as a whole (typically a 6N-dimensional phase space, if the system contains N separate particles). An overestimation of entropy will occur if all correlations, and more generally all statistical dependences between the state probabilities, are ignored. Such correlations arise in systems of interacting particles, that is, in all systems more complex than an ideal gas.
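
As an illustration, here is a minimal sketch in Python (not part of the original article) that evaluates equation (1) for a discrete distribution; the three-state probabilities are a hypothetical example, and k_B is taken from scipy.constants:

    import numpy as np
    from scipy.constants import Boltzmann as k_B  # Boltzmann constant, J/K

    def gibbs_entropy(p):
        """Evaluate equation (1): S = -k_B * sum_i p_i ln p_i."""
        p = np.asarray(p, dtype=float)
        assert np.isclose(p.sum(), 1.0), "probabilities must sum to 1"
        nz = p > 0  # p ln p -> 0 as p -> 0, so zero-probability states drop out
        return -k_B * np.sum(p[nz] * np.log(p[nz]))

    print(gibbs_entropy([0.5, 0.3, 0.2]))    # ~1.42e-23 J/K
    print(gibbs_entropy([1/3, 1/3, 1/3]))    # k_B ln 3, ~1.52e-23 J/K

For a fixed number of accessible states, the uniform distribution gives the largest entropy, consistent with the Boltzmann formula discussed below.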

The Shannon entropy formula is mathematically and conceptually equivalent to equation (1); the factor of k_B out front reflects two choices: the base of the logarithm,[1] and the use of an arbitrary temperature scale with water as a reference substance.
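
To make the base-of-logarithm point concrete, a small check (an illustrative sketch, not from the article; the distribution is hypothetical): the Shannon entropy computed in bits, rescaled by k_B ln 2, reproduces equation (1) in joules per kelvin.

    import numpy as np
    from scipy.constants import Boltzmann as k_B

    p = np.array([0.5, 0.25, 0.25])            # hypothetical distribution

    H_bits = -np.sum(p * np.log2(p))           # Shannon entropy, base-2 log
    S_gibbs = -k_B * np.sum(p * np.log(p))     # Gibbs entropy, natural log

    # Changing the base of the logarithm is only a multiplicative constant:
    # ln x = (ln 2) * log2(x), so S = k_B * ln 2 * H_bits.
    assert np.isclose(S_gibbs, k_B * np.log(2) * H_bits)
    print(H_bits, S_gibbs)                     # 1.5 bits, ~1.44e-23 J/K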

The importance of this formula is discussed at much greater length in the main article Entropy (thermodynamics).

This S is almost universally called simply the entropy. It can also be called the statistical entropy or the thermodynamic entropy without changing the meaning. The von Neumann entropy formula is a slightly more general way of calculating the same thing. The Boltzmann entropy formula can be seen as a corollary of equation (1), valid under the restrictive condition that there is no statistical dependence between the states.[2]
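
For instance (a standard derivation, sketched here for concreteness): if the W accessible states are equally probable, so that p_i = 1/W for every i, equation (1) collapses to the Boltzmann entropy formula,

 S = -k_B \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = -k_B \ln \frac{1}{W} = k_B \ln W .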



References

  1. ^ The Laws of Thermodynamics, including careful definitions of energy, entropy, et cetera.
  2. ^ Jaynes, E. T. (1965). "Gibbs vs Boltzmann entropies". American Journal of Physics, 33, 391–398.