Negentropy
Negentropy has different meanings in information theory and theoretical biology. In a biological context, the negentropy (also negative entropy, syntropy, extropy, ectropy or entaxy[1]) of a living system is the entropy that it exports to keep its own entropy low; it lies at the intersection of entropy and life. Informally, negentropy is the reverse of entropy: it denotes things becoming more ordered, where 'order' means organisation, structure and function, the opposite of randomness or chaos. The concept and phrase "negative entropy" were introduced by Erwin Schrödinger in his 1944 popular-science book What is Life?[2] Léon Brillouin later shortened the phrase to negentropy[3][4] to express it in a more "positive" way: a living system imports negentropy and stores it.[5] In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy, a term that may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but negentropy remains the common term.
In a note to What is Life? Schrödinger explained his use of this phrase.
“[...] if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.”
Indeed, negentropy has been used by biologists as the basis for purpose or direction in life, namely cooperative or moral instincts.[6]
In 2009, Mahulikar & Herwig redefined the negentropy of a dynamically ordered sub-system as the specific entropy deficit of the ordered sub-system relative to its surrounding chaos.[7] Thus negentropy has SI units of J kg⁻¹ K⁻¹ when defined in terms of specific entropy per unit mass, and K⁻¹ when defined in terms of specific entropy per unit energy. This definition enabled: (i) a scale-invariant thermodynamic representation of the existence of dynamic order, (ii) the formulation of physical principles exclusively for the existence and evolution of dynamic order, and (iii) a mathematical interpretation of Schrödinger's negentropy debt.
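Read literally, this definition is just a difference of specific entropies. A minimal sketch (the function name and the numerical values below are our own illustrative assumptions, not taken from the paper):

```python
def specific_negentropy(s_surroundings: float, s_subsystem: float) -> float:
    """Entropy deficit of an ordered sub-system (hypothetical helper).

    Both arguments are specific entropies in J kg^-1 K^-1, so the result
    carries the same units; it is positive when the sub-system is more
    ordered (lower specific entropy) than its surrounding chaos.
    """
    return s_surroundings - s_subsystem

# Hypothetical values, for illustration only: surroundings at
# 3900 J kg^-1 K^-1, ordered sub-system at 3000 J kg^-1 K^-1.
print(specific_negentropy(3900.0, 3000.0))  # 900.0 J kg^-1 K^-1
```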
Information theory
In information theory and statistics, negentropy is used as a measure of distance to normality.[8][9][10] Out of all distributions with a given mean and variance, the normal or Gaussian distribution has the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant under any linear invertible change of coordinates, and vanishes if and only if the signal is Gaussian.
Negentropy is defined as

$$J(p_x) = S(\varphi_x) - S(p_x)$$

where $S(\varphi_x)$ is the differential entropy of the Gaussian density $\varphi_x$ with the same mean and variance as $p_x$, and $S(p_x)$ is the differential entropy of $p_x$:

$$S(p_x) = -\int p_x(u) \log p_x(u)\, du$$
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in Independent Component Analysis.[11][12]
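A small numerical sketch of this definition (our own illustration: it approximates the differential entropy with a crude histogram estimator, so the bin count and the resulting accuracy are arbitrary assumptions):

```python
import numpy as np

def gaussian_entropy(var: float) -> float:
    """Differential entropy (nats) of a Gaussian with variance `var`."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

def entropy_histogram(x: np.ndarray, bins: int = 64) -> float:
    """Crude histogram estimate of differential entropy (nats)."""
    f, edges = np.histogram(x, bins=bins, density=True)
    widths = np.diff(edges)
    nz = f > 0
    # H(x) ~ -sum_i f_i ln(f_i) * width_i
    return -np.sum(f[nz] * np.log(f[nz]) * widths[nz])

def negentropy(x: np.ndarray, bins: int = 64) -> float:
    """J(x) = H(Gaussian with same variance) - H(x); zero iff x is Gaussian."""
    return gaussian_entropy(np.var(x)) - entropy_histogram(x, bins)

rng = np.random.default_rng(0)
print(negentropy(rng.normal(size=100_000)))          # ~0 for Gaussian data
print(negentropy(rng.uniform(-1, 1, size=100_000)))  # >0 for a non-Gaussian signal
```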
Correlation between statistical negentropy and Gibbs' free energy
There is a physical quantity closely linked to free energy (free enthalpy), with units of entropy, that is isomorphic to negentropy as known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see the quantity called capacity for entropy. This quantity is the amount by which the entropy of the system may be increased without changing its internal energy or increasing its volume.[13] In other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by Massieu for the isothermal process[14][15][16] (the two quantities differ only in sign) and then by Planck for the isothermal-isobaric process.[17] More recently, the Massieu-Planck thermodynamic potential, also known as free entropy, has been shown to play an important role in the so-called entropic formulation of statistical mechanics,[18] applied, among other fields, in molecular biology[19] and in thermodynamic non-equilibrium processes.[20]
In particular, the negentropy (capacity for entropy) can be written as

$$J = S_{\max} - S = -\Phi = -k \ln Z$$

where:
- $S$ is entropy
- $J$ is negentropy (Gibbs' "capacity for entropy")
- $\Phi$ is the Massieu potential
- $Z$ is the partition function
- $k$ is the Boltzmann constant
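A short self-contained sketch of this "capacity for entropy" for a discrete system (our own illustration; the probabilities are made up, and $S_{\max} = k \ln N$ here corresponds to the uniform distribution over $N$ states):

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(p) -> float:
    """S = -k * sum_i p_i ln p_i for a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -K_B * np.sum(p * np.log(p))

def capacity_for_entropy(p) -> float:
    """J = S_max - S: how much entropy the system could still gain.

    For N states, S_max = k ln N (the uniform distribution), so J >= 0
    and J = 0 exactly when the system is maximally disordered.
    """
    return K_B * np.log(len(p)) - gibbs_entropy(p)

print(capacity_for_entropy([0.97, 0.01, 0.01, 0.01]))  # large deficit: ordered
print(capacity_for_entropy([0.25, 0.25, 0.25, 0.25]))  # ~0: no capacity left
```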
Risk management
In risk management, negentropy is the force that seeks to achieve effective organizational behavior and leads to a steady, predictable state.[21]
Brillouin's negentropy principle of information
In 1953, Léon Brillouin derived a general equation[22] stating that changing an information bit value requires at least kT ln 2 of energy. This is the same energy as the work Leó Szilárd's engine produces in the idealized case. In his book,[23] he further explored this problem, concluding that any cause of this bit-value change (measurement, decision about a yes/no question, erasure, display, etc.) requires the same amount of energy.
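For a sense of scale, a quick evaluation of this bound at room temperature (the choice T = 300 K is our assumption):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # assumed room temperature, K

E_min = k * T * math.log(2)  # Brillouin's minimum energy per bit-value change
print(f"{E_min:.3e} J")                     # ~2.87e-21 J
print(f"{E_min / 1.602176634e-19:.4f} eV")  # ~0.0179 eV
```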
Notes
1. Wiener, Norbert
2. Schrödinger, Erwin (1944). What is Life? The Physical Aspect of the Living Cell. Cambridge University Press.
3. Brillouin, Léon (1953). "The Negentropy Principle of Information". Journal of Applied Physics 24 (9), pp. 1152-1163.
4. Brillouin, Léon (1959). La science et la théorie de l'information. Masson.
5. Ho, Mae-Wan. What is (Schrödinger's) Negentropy?. Bioelectrodynamics Laboratory, Open University, Walton Hall, Milton Keynes.
6. Griffith, Jeremy (2011). "What is the Meaning of Life?". In The Book of Real Answers to Everything! ISBN 9781741290073. http://www.worldtransformation.com/what-is-the-meaning-of-life/
7. Mahulikar, S. P. & Herwig, H. (2009). "Exact thermodynamic principles for dynamic order existence and evolution in chaos". Chaos, Solitons & Fractals 41 (4), pp. 1939-1948.
8. Hyvärinen, Aapo. Survey on Independent Component Analysis, node32: Negentropy. Helsinki University of Technology, Laboratory of Computer and Information Science.
9. Hyvärinen, Aapo and Oja, Erkki. Independent Component Analysis: A Tutorial, node14: Negentropy. Helsinki University of Technology, Laboratory of Computer and Information Science.
10. Wang, Ruye. Independent Component Analysis, node4: Measures of Non-Gaussianity.
11. Comon, P. (1994). "Independent Component Analysis - a new concept?". Signal Processing 36, pp. 287-314.
12. Leibovici, Didier G. and Beckmann, Christian (2001). An Introduction to Multiway Methods for Multi-Subject fMRI Experiment. FMRIB Technical Report, Oxford Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), Department of Clinical Neurology, University of Oxford, John Radcliffe Hospital, Headington, Oxford, UK.
13. Gibbs, Willard (1873). "A Method of Geometrical Representation of the Thermodynamic Properties of Substances by Means of Surfaces". Transactions of the Connecticut Academy, pp. 382-404.
14. Massieu, M. F. (1869a). "Sur les fonctions caractéristiques des divers fluides". C. R. Acad. Sci. LXIX, pp. 858-862.
15. Massieu, M. F. (1869b). "Addition au précédent mémoire sur les fonctions caractéristiques". C. R. Acad. Sci. LXIX, pp. 1057-1061.
16. Massieu, M. F. (1869). Compt. Rend. 69 (858): 1057.
17. Planck, M. (1945). Treatise on Thermodynamics. Dover, New York.
18. Planes, Antoni and Vives, Eduard (2000). Entropic Formulation of Statistical Mechanics: Entropic variables and Massieu-Planck functions. Universitat de Barcelona, 2000-10-24.
19. Schellman, John A. (1997). "Temperature, Stability, and the Hydrophobic Interaction". Biophysical Journal 73 (December 1997), pp. 2960-2964. Institute of Molecular Biology, University of Oregon, Eugene, Oregon, USA.
20. Hens, Z. and de Hemptinne, X. Non-equilibrium Thermodynamics Approach to Transport Processes in Gas Mixtures. Department of Chemistry, Catholic University of Leuven, Heverlee, Belgium.
21. Pedagogical Risk and Governmentality: Shantytowns in Argentina in the 21st Century (see p. 4).
22. Brillouin, Léon (1953). "The Negentropy Principle of Information". Journal of Applied Physics 24, pp. 1152-1163.
23. Brillouin, Léon (1956). Science and Information Theory. Dover.