Entropy (ecology)

Ecological entropy is a measure of biodiversity used in the study of ecology.

Definition

Assume that within a system containing an unlimited number of individuals there exist n species of organisms (A1, A2, ..., Aj, ..., An), and that the probability that an individual belongs to species Aj is pj, where

\sum_{j=1}^n p_j = 1

If qj denotes the number of individuals of species Aj, the total population is q = q1 + q2 + ... + qj + ... + qn, so that qj = qpj. The diversity of the whole population of q individuals can then be expressed as C log q, where C is a constant.
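As a minimal numerical sketch of this setup (the species labels and counts below are hypothetical, C is simply set to 1, and the common logarithm is used as in the rest of this article):

import math

# Hypothetical counts qj for each species Aj
counts = {"A1": 50, "A2": 30, "A3": 20}

C = 1.0                      # arbitrary constant C
q = sum(counts.values())     # total population q = q1 + q2 + ... + qn

# Probability that an individual belongs to species Aj: pj = qj / q
p = {species: qj / q for species, qj in counts.items()}
assert abs(sum(p.values()) - 1.0) < 1e-12   # the pj sum to 1

# Diversity of the whole population of q individuals: C log q
whole_population_diversity = C * math.log10(q)
print(p, whole_population_diversity)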

The diversity within each species, arising from differences such as male and female or large and small individuals, can be expressed as

C \sum_{j} p_j \log q_j = C \sum_{j} p_j \log p_j + C \log q \quad (\mbox{since } q_j = q p_j \mbox{ and } \sum_{j} p_j = 1)
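This identity can be checked numerically; the sketch below reuses the hypothetical counts from the example above, again with C = 1 and base-10 logarithms:

import math

counts = {"A1": 50, "A2": 30, "A3": 20}   # hypothetical qj
C = 1.0
q = sum(counts.values())
p = {s: qj / q for s, qj in counts.items()}

# Left-hand side: C * sum_j pj * log qj
lhs = C * sum(p[s] * math.log10(counts[s]) for s in counts)

# Right-hand side: C * sum_j pj * log pj + C * log q
rhs = C * sum(pj * math.log10(pj) for pj in p.values()) + C * math.log10(q)

assert abs(lhs - rhs) < 1e-12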

Hence the specific diversity can be derived by subtracting the diversities within the species from the diversity of the whole population:

C \log q - \left( C \sum_{j} p_j \log p_j + C \log q \right) = -C \sum_{j} p_j \log p_j
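Computed for the same hypothetical counts, the specific diversity -C Σ pj log pj can be obtained as follows (a sketch, with C = 1 and base-10 logarithms as in this article):

import math

def specific_diversity(counts, C=1.0):
    # -C * sum_j pj * log10(pj) for a dict of species counts (hypothetical helper)
    q = sum(counts.values())
    return -C * sum((qj / q) * math.log10(qj / q) for qj in counts.values() if qj > 0)

print(specific_diversity({"A1": 50, "A2": 30, "A3": 20}))   # ≈ 0.447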

The expression is the same as that of information entropy, except that it is customary to use the binary logarithm (base 2) for information entropy, the unit being the bit, and the common logarithm (base 10) for ecological entropy, the unit being the bel. Thermodynamic entropy uses the natural logarithm (base e), so that the constant C becomes the Boltzmann constant, the ratio of the universal gas constant to Avogadro's number.
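Because entropies computed in different bases differ only by a constant factor (log_b x = ln x / ln b), a value can be converted between these units; a small sketch:

import math

def convert_entropy(value, from_base, to_base):
    # H in base b equals H in base a times ln(a)/ln(b)
    return value * math.log(from_base) / math.log(to_base)

h_bels = 0.447                                  # value from the example above, base 10
print(convert_entropy(h_bels, 10, 2))           # ≈ 1.485 bits
print(convert_entropy(h_bels, 10, math.e))      # ≈ 1.029 nats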
