Tsallis entropy

The Tsallis entropy is a generalization of the standard Boltzmann-Gibbs entropy, introduced by Constantino Tsallis in 1988. In the continuous case it is defined as

S_q(p) = {1 \over q - 1} \left( 1 - \int p^q(x)\, dx \right),

or in the discrete case

S_q(p) = {1 \over q - 1} \left( 1 - \sum_x p^q(x) \right).

Here p denotes the probability distribution of interest and q is a real parameter. In the limit as q → 1, the ordinary Boltzmann-Gibbs entropy is recovered.
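The limit can be made explicit in the discrete case by writing p^q(x) = p(x) e^{(q-1)\ln p(x)} and expanding to first order in (q − 1), which gives

\lim_{q \to 1} S_q(p) = - \sum_x p(x) \ln p(x),

the familiar Boltzmann-Gibbs (Shannon) form.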

The parameter q is a measure of the non-extensivity of the system of interest. As shown above, there are continuous and discrete versions of this entropic measure.
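As a minimal numerical sketch (assuming NumPy; the helper name tsallis_entropy is illustrative, not a standard library function), the discrete definition and its q → 1 limit can be evaluated directly:

import numpy as np

def tsallis_entropy(p, q):
    # Discrete Tsallis entropy: S_q(p) = (1 - sum_i p_i^q) / (q - 1)
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        # q -> 1 limit: the Boltzmann-Gibbs (Shannon) entropy -sum_i p_i ln p_i
        pos = p[p > 0]
        return -np.sum(pos * np.log(pos))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, q=2.0))     # q = 2 value
print(tsallis_entropy(p, q=1.001))   # already close to the Boltzmann-Gibbs value
print(tsallis_entropy(p, q=1.0))     # Boltzmann-Gibbs (Shannon) entropy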

Various relationships

The discrete Tsallis entropy satisfies

S_q = - \left [ D_q \sum_i p_i^x \right ]_{x=1}

where D_q is the q-derivative (Jackson derivative) with respect to x.
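This identity is easy to check numerically. The sketch below (NumPy assumed; jackson_q_derivative is an illustrative helper, not a library routine) compares the defining formula for S_q with the q-derivative expression:

import numpy as np

def jackson_q_derivative(f, x, q):
    # Jackson q-derivative: D_q f(x) = (f(q x) - f(x)) / ((q - 1) x)
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

p = np.array([0.5, 0.3, 0.2])
q = 2.5

f = lambda x: np.sum(p ** x)                       # f(x) = sum_i p_i^x

s_q_definition = (1.0 - np.sum(p ** q)) / (q - 1.0)
s_q_derivative = -jackson_q_derivative(f, 1.0, q)  # -[D_q sum_i p_i^x] at x = 1

print(s_q_definition, s_q_derivative)              # the two expressions coincide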


Non-extensivity

Given two independent systems A and B, for which the joint probability density satisfies

p(A,B) = p(A)p(B),

the Tsallis entropy of this system satisfies

S_q(A,B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B).

From this result, it is evident that the parameter q is a measure of the departure from extensivity. In the limit q → 1 this reduces to

S(A,B) = S(A) + S(B)

which is what is expected for an extensive system.
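The pseudo-additivity rule can be verified numerically for a product distribution. The following sketch (NumPy assumed; names are illustrative) builds the joint distribution of two independent systems as an outer product and compares the two sides:

import numpy as np

def tsallis_entropy(p, q):
    # Discrete Tsallis entropy: S_q(p) = (1 - sum p^q) / (q - 1)
    p = np.asarray(p, dtype=float).ravel()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

q = 1.7
p_A = np.array([0.6, 0.4])
p_B = np.array([0.2, 0.3, 0.5])
p_AB = np.outer(p_A, p_B)            # joint distribution p(A,B) = p(A) p(B)

lhs = tsallis_entropy(p_AB, q)
rhs = (tsallis_entropy(p_A, q) + tsallis_entropy(p_B, q)
       + (1.0 - q) * tsallis_entropy(p_A, q) * tsallis_entropy(p_B, q))

print(lhs, rhs)                      # equal up to floating-point rounding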

See also