User:Sadi Carnot/Sandbox8

In this direction, a number of authors have in recent years derived exact entropy formulas to account for and measure disorder and order in atomic and molecular assemblies.[1][2][3][4] One of the simpler entropy order/disorder formulas is that derived in 1984 by the physicist Peter Landsberg, based on a combination of thermodynamics and information-theory arguments. Landsberg argues that when constraints operate on a system, preventing it from entering one or more of its possible or permitted states, as contrasted with its forbidden states, the measure of the total amount of “disorder” in the system is given by the following expression:

$$\text{Disorder} = \frac{C_D}{C_I}$$

Similarly, the total amount of "order" in the system is given by:

$$\text{Order} = 1 - \frac{C_O}{C_I}$$

Here C_D is the "disorder" capacity of the system, i.e. the entropy of the parts contained in the permitted ensemble; C_I is the "information" capacity of the system, an expression similar to Shannon's channel capacity; and C_O is the "order" capacity of the system.[2]
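As a minimal numerical sketch of how these ratios behave, consider a hypothetical system with eight equally likely possible states, of which constraints permit only four. The values and the choice of a uniform ensemble are illustrative assumptions, not taken from Landsberg's paper; in particular, the order capacity C_O is here simply set equal to C_D for the sake of the example.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: 8 possible states, of which the constraints
# permit only 4, taken as equally likely (uniform ensemble).
C_I = math.log2(8)                  # information capacity: 3 bits
C_D = shannon_entropy([0.25] * 4)   # disorder capacity: entropy of the
                                    # permitted ensemble, here 2 bits

# Assumption for this sketch only: order capacity taken equal to C_D.
C_O = C_D

disorder = C_D / C_I        # Landsberg's disorder ratio
order = 1 - C_O / C_I       # Landsberg's order measure

print(f"Disorder = {disorder:.3f}, Order = {order:.3f}")
```

Both measures are dimensionless and lie between 0 and 1: an unconstrained uniform system (C_D = C_I) would give disorder 1, while tightening the constraints drives the disorder ratio toward 0.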

Definitions

Shown below are current science encyclopedia and science dictionary definitions of entropy:

  • Entropy – a measure of the unavailability of a system’s energy to do work; also a measure of disorder; the higher the entropy the greater the disorder.[5]
  • Entropy – a measure of disorder; the higher the entropy the greater the disorder.[6]
  • Entropy – in thermodynamics, a parameter representing the state of disorder of a system at the atomic, ionic, or molecular level; the greater the disorder the higher the entropy.[7]
  • Entropy – a measure of disorder in the universe or of the availability of the energy in a system to do work.[8]