Divergence (statistics)
In information geometry, a divergence is a kind of statistical distance: a binary function which establishes the separation from one probability distribution to another on a statistical manifold.
The simplest divergence is squared Euclidean distance (SED), and divergences can be viewed as generalizations of SED. The other most important divergence is relative entropy (also called Kullback–Leibler divergence), which is central to information theory. There are numerous other specific divergences and classes of divergences, notably f-divergences and Bregman divergences (see § Examples).
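The two divergences named above can be sketched in a few lines of Python for discrete distributions (the function names here are illustrative, not part of any standard API). Note the asymmetry of relative entropy, which is what distinguishes a divergence from a metric distance:

```python
import math

def squared_euclidean(p, q):
    # Squared Euclidean distance (SED): the simplest divergence; symmetric.
    return sum((pi - qi) ** 2 for pi, qi in zip(p, q))

def kl_divergence(p, q):
    # Relative entropy (Kullback-Leibler divergence), in nats; asymmetric.
    # Terms with pi == 0 contribute 0 by the convention 0 * log(0) = 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(squared_euclidean(p, q))  # same value with the arguments swapped
print(kl_divergence(p, q))      # generally differs from kl_divergence(q, p)
print(kl_divergence(q, p))
```

Both functions vanish exactly when p = q, as required of a divergence, but only SED is symmetric in its arguments.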
Given a differentiable manifold M of dimension n, a divergence on M is a C²-function D : M × M → [0, ∞) satisfying:
1. D(p, q) ≥ 0 for all p, q in M,
2. D(p, q) = 0 if and only if p = q,
3. At every point p in M, D(p, p + dp) is a positive-definite quadratic form for infinitesimal displacements dp from p.
In applications to statistics, the manifold is typically the space of parameters of a parametric family of probability distributions.
Condition 3 means that D defines an inner product on the tangent space T_pM for every p in M. Since D is C² on M, this defines a Riemannian metric g on M.
Locally at p in M, we may construct a local coordinate chart with coordinates x, in which the divergence is D(x(p), x(p) + dx) = (1/2) dxᵀ g_p(x) dx + O(|dx|³), where g_p(x) is a matrix of size n × n. It is the Riemannian metric at the point p, expressed in coordinates x.
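The local quadratic expansion can be checked numerically in a one-parameter case. As a sketch (not from the article): for the Bernoulli family with parameter θ, the metric induced by the KL divergence is the Fisher information g(θ) = 1/(θ(1 − θ)), and KL(θ, θ + dθ) should agree with (1/2) g(θ) dθ² to leading order:

```python
import math

def kl_bernoulli(a, b):
    # KL divergence between Bernoulli(a) and Bernoulli(b), in nats.
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

theta, d = 0.3, 1e-4
g = 1.0 / (theta * (1 - theta))   # Fisher information = metric induced by KL
quadratic = 0.5 * g * d ** 2      # (1/2) dx^T g dx in one dimension
exact = kl_bernoulli(theta, theta + d)
print(exact, quadratic)           # agree up to O(d^3) corrections
```

Shrinking d by a factor of 10 shrinks the relative discrepancy by roughly the same factor, consistent with the O(|dx|³) remainder.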
Dimensional analysis of condition 3 shows that divergence has the dimension of squared distance.
The dual divergence D* is defined as D*(p, q) = D(q, p).
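Since the dual divergence just swaps the arguments, D*(p, q) = D(q, p), it is a one-liner to construct. As an illustrative sketch, the dual of the KL divergence is the "reverse" KL divergence:

```python
import math

def dual(D):
    # Dual divergence: D*(p, q) = D(q, p).
    return lambda p, q: D(q, p)

def kl(p, q):
    # KL divergence between two discrete distributions, in nats.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

reverse_kl = dual(kl)               # the dual of KL is reverse KL
p, q = [0.7, 0.3], [0.5, 0.5]
print(kl(p, q), reverse_kl(p, q))   # reverse_kl(p, q) equals kl(q, p)
```

Taking the dual twice returns the original divergence, and a divergence is symmetric exactly when it equals its own dual.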