Tsallis entropy
In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy. It is proportional to the expectation of the q-logarithm of a distribution.
The concept was introduced in 1988 by Constantino Tsallis as a basis for generalizing standard statistical mechanics, and it is identical in form to the Havrda–Charvát structural α-entropy introduced in 1967 within information theory.
Given a discrete set of probabilities $\{p_i\}$ with the condition $\sum_i p_i = 1$, and $q$ any real number, the Tsallis entropy is defined as

$$S_q(p) = \frac{k}{q-1}\left(1 - \sum_i p_i^{\,q}\right),$$

where $q$ is a real parameter sometimes called the entropic index and $k$ a positive constant.
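As a quick numerical illustration of the discrete definition, here is a minimal Python sketch (the function name tsallis_entropy, the default k = 1, and the example three-state distribution are illustrative choices, not part of the original formulation):

```python
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """Discrete Tsallis entropy S_q = k/(q-1) * (1 - sum_i p_i**q).

    For q == 1 the Boltzmann-Gibbs/Shannon form -k * sum_i p_i * ln(p_i) is used,
    since the general expression is indeterminate (0/0) at q = 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # states with p_i = 0 contribute nothing
    if np.isclose(q, 1.0):
        return -k * np.sum(p * np.log(p))
    return k / (q - 1.0) * (1.0 - np.sum(p ** q))

# Example: a biased three-state distribution at a few entropic indices
p = [0.5, 0.3, 0.2]
for q in (0.5, 1.0, 2.0):
    print(f"q = {q}: S_q = {tsallis_entropy(p, q):.4f}")
```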
In the limit as $q \to 1$, the usual Boltzmann–Gibbs entropy is recovered, namely

$$S_{\mathrm{BG}} = \lim_{q \to 1} S_q(p) = -k \sum_i p_i \ln p_i,$$

where one identifies the constant $k$ with the Boltzmann constant $k_{\mathrm{B}}$.
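One standard way to see this limit is to write $p_i^{\,q} = p_i\, e^{(q-1)\ln p_i}$ and expand the exponential to first order in $(q-1)$, using $\sum_i p_i = 1$:

$$\begin{aligned}
S_q &= \frac{k}{q-1}\Bigl(1 - \sum_i p_i\, e^{(q-1)\ln p_i}\Bigr)
     = \frac{k}{q-1}\Bigl(1 - \sum_i p_i\bigl[1 + (q-1)\ln p_i + O\!\bigl((q-1)^2\bigr)\bigr]\Bigr) \\
    &= -k \sum_i p_i \ln p_i + O(q-1)
    \;\xrightarrow[q \to 1]{}\; -k \sum_i p_i \ln p_i .
\end{aligned}$$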
For continuous probability distributions, we define the entropy as

$$S_q = \frac{k}{q-1}\left(1 - \int \bigl(p(x)\bigr)^{q} \, dx\right),$$

where $p(x)$ is a probability density function.
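The continuous definition can be checked numerically in the same spirit. Below is a minimal sketch, assuming a standard Gaussian density as the example and scipy.integrate.quad for the quadrature; the closed-form comparison value follows from evaluating the Gaussian integral $\int p(x)^q\,dx = (\sigma\sqrt{2\pi})^{1-q}/\sqrt{q}$ by hand (all names here are illustrative):

```python
import numpy as np
from scipy.integrate import quad

def tsallis_entropy_continuous(pdf, q, k=1.0, lo=-np.inf, hi=np.inf):
    """Continuous Tsallis entropy S_q = k/(q-1) * (1 - integral of p(x)**q dx)."""
    integral, _ = quad(lambda x: pdf(x) ** q, lo, hi)
    return k / (q - 1.0) * (1.0 - integral)

# Example: zero-mean Gaussian density with standard deviation sigma
sigma, q = 1.0, 2.0
pdf = lambda x: np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

numeric = tsallis_entropy_continuous(pdf, q)

# Closed form for the Gaussian: integral of p**q dx = (sigma*sqrt(2*pi))**(1-q) / sqrt(q)
closed = 1.0 / (q - 1.0) * (1.0 - (sigma * np.sqrt(2 * np.pi)) ** (1 - q) / np.sqrt(q))

print(numeric, closed)   # the two values should agree to quadrature accuracy
```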