Gibbs entropy

In statistical mechanics, the Gibbs entropy formula is the standard formula for calculating the entropy of a thermodynamic system,

S = -k_B \sum_i p_i \log p_i           (1)

where p_i is the probability that the system is in microstate i, and the summation runs over all possible microstates of the system as a whole (points in a 6N-dimensional phase space, if the system contains N particles). This form assumes there are no coherent quantum correlations between the states.
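
As a concrete illustration, equation (1) can be evaluated directly from a list of state probabilities. The following is a minimal Python sketch; the function name and the example distribution are illustrative choices, not anything prescribed by the formula itself:

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

    def gibbs_entropy(probabilities):
        # Equation (1): S = -k_B * sum_i p_i * log(p_i).
        # States with p = 0 contribute nothing, since p log p -> 0 as p -> 0.
        return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

    # Two equally likely microstates: S = k_B ln 2, about 9.57e-24 J/K
    print(gibbs_entropy([0.5, 0.5]))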

The importance of this formula is discussed at much greater length in the main article Entropy (thermodynamics).

This S is almost universally called simply the entropy; it can also be called the statistical entropy or the thermodynamic entropy without changing the meaning. The von Neumann entropy formula is a slightly more general way of calculating the same quantity. The Shannon entropy formula is mathematically and conceptually equivalent to equation (1); the factor of k_B out front reflects the choice of base for the logarithm and has no deep significance.[1] The Boltzmann entropy formula can be seen as a corollary of equation (1), valid under certain restrictive conditions, as illustrated below.[2]
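
For example, one such restrictive condition is that each of W accessible microstates is equally probable, p_i = 1/W; equation (1) then reduces to the Boltzmann entropy formula:

S = -k_B \sum_{i=1}^{W} \frac{1}{W} \log \frac{1}{W} = k_B \log W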


References

  1. ^ The Laws of Thermodynamics, including careful definitions of energy, entropy, et cetera.
  2. ^ Jaynes, E. T. (1965). "Gibbs vs Boltzmann Entropies". American Journal of Physics, 33, 391–398.