Tsallis entropy

In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy. It is proportional to the expectation of the q-logarithm of a distribution.

The concept was introduced in 1988 by Constantino Tsallis as a basis for generalizing the standard statistical mechanics and is identical in form to Havrda–Charvát structural α-entropy, introduced in 1967 within information theory.

Given a discrete set of probabilities $\{p_i\}$ with the condition $\sum_i p_i = 1$, and $q$ any real number, the Tsallis entropy is defined as

$$S_q(p) = \frac{k}{q-1}\left(1 - \sum_i p_i^q\right),$$

where $q$ is a real parameter sometimes called the entropic index and $k$ a positive constant.
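As a minimal numerical sketch (not from the original article), the discrete definition above can be computed directly; the function name and the fallback to the Boltzmann–Gibbs form at $q = 1$ are choices made here for illustration:

```python
import math

def tsallis_entropy(probs, q, k=1.0):
    """Tsallis entropy S_q = k/(q-1) * (1 - sum_i p_i^q).

    At q == 1 the formula is singular, so we return its limit,
    the Boltzmann-Gibbs entropy -k * sum_i p_i * ln(p_i).
    """
    if q == 1:
        return -k * sum(p * math.log(p) for p in probs if p > 0)
    return k / (q - 1) * (1 - sum(p ** q for p in probs))

# Uniform distribution over 4 states, q = 2:
# sum p_i^2 = 4 * (1/4)^2 = 1/4, so S_2 = (1 - 1/4)/(2 - 1) = 0.75
print(tsallis_entropy([0.25] * 4, q=2))  # → 0.75
```

Note the sign of $q - 1$ in the prefactor: for $q > 1$ the sum $\sum_i p_i^q < 1$, so $S_q$ stays non-negative.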

In the limit as $q \to 1$, the usual Boltzmann–Gibbs entropy is recovered, namely

$$\lim_{q \to 1} S_q(p) = -k \sum_i p_i \ln p_i,$$

where one identifies $k$ with the Boltzmann constant $k_{\mathrm{B}}$.
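The limit can be checked numerically; this is a small verification sketch (added here, not part of the original text) comparing $S_q$ for $q$ approaching 1 against the Boltzmann–Gibbs value:

```python
import math

def tsallis(probs, q, k=1.0):
    # S_q = k/(q-1) * (1 - sum_i p_i^q), valid for q != 1
    return k / (q - 1) * (1 - sum(p ** q for p in probs))

def boltzmann_gibbs(probs, k=1.0):
    # S = -k * sum_i p_i * ln(p_i)
    return -k * sum(p * math.log(p) for p in probs if p > 0)

probs = [0.5, 0.3, 0.2]
for q in (1.1, 1.01, 1.001):
    # The gap shrinks roughly linearly in (q - 1)
    print(q, tsallis(probs, q), boltzmann_gibbs(probs))
```

Writing $p_i^q = p_i\, e^{(q-1)\ln p_i} \approx p_i\,[1 + (q-1)\ln p_i]$ and substituting into the definition makes the limit explicit.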

For continuous probability distributions, we define the entropy as

$$S_q[p] = \frac{k}{q-1}\left(1 - \int [p(x)]^q \, dx\right),$$

where $p(x)$ is a probability density function.
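The continuous definition can be approximated by quadrature. Below is a hedged sketch using a midpoint rule; the exponential test density $p(x) = e^{-x}$ is chosen here because $\int_0^\infty [p(x)]^q\,dx = 1/q$, giving the closed form $S_q = (1 - 1/q)/(q-1) = 1/q$ to check against:

```python
import math

def tsallis_continuous(pdf, q, a, b, n=100_000, k=1.0):
    """Approximate S_q = k/(q-1) * (1 - integral of p(x)^q over [a, b])
    with the midpoint rule on n subintervals."""
    h = (b - a) / n
    integral = sum(pdf(a + (i + 0.5) * h) ** q for i in range(n)) * h
    return k / (q - 1) * (1 - integral)

# Exponential density on [0, inf), truncated at 50 where the tail is negligible.
# For q = 2 the exact value is 1/q = 0.5.
pdf = lambda x: math.exp(-x)
print(tsallis_continuous(pdf, q=2, a=0.0, b=50.0))  # ≈ 0.5
```

Note that, unlike the discrete case, the integral $\int [p(x)]^q\,dx$ is not dimensionless when $x$ carries units, so the continuous form is usually stated for an appropriately nondimensionalized density.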
