3 Geometrical uncertainties and geometrical entropies
Let $P = \{p(x)\}$ be the set of multinomial probability distributions $p(x)$ over the random variable $x$. These distributions are parameterized by the $M$-dimensional real vector parameter $p = (p_1, \dots, p_M)$. The set $P$ can be treated as a statistical model, known as the multinomial statistical model [1]. When $p(x)$ is sufficiently smooth in $p_1, \dots, p_M$, the statistical model $P$ forms an $M$-dimensional manifold embedded in the set of all possible probability distributions, where $p_1, \dots, p_M$ play the role of a coordinate system [1,20,21]. Mutual relations between distributions are then understandable as geometrical properties of the manifold. The question is: what is the natural geometric structure to be introduced in a manifold consisting of a statistical model? The answer to this problem is given by information geometry [1,20,21], the field that studies the geometrical structure of the manifolds of probability distributions. An introduction to information geometry is given in reference [21]. Reference [20] reviews the geometry of the manifolds of statistical models, and reference [1] analyzes information geometry thoroughly.
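As a simple illustration, for $M = 3$ each distribution assigns the probabilities $(p_1, p_2, p_3)$ to the three possible outcomes and is represented by the point of the manifold whose coordinates are precisely $(p_1, p_2, p_3)$; a one-parameter family of distributions $p_1(t), p_2(t), p_3(t)$ then corresponds to a curve traced on this manifold.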
As we will see in this section, information geometry relates geometry and uncertainty (or information) in a natural way. The results provided in this work are based on this relation. The same idea underlies the work by Balian et al., where a geometrical theory of statistical physics is proposed in terms of Riemannian geometry [2].
When the inner product of two vectors belonging to the tangent space is defined, the manifold $P$ is called a Riemannian space. In this case, the natural geometrical structure to be introduced in the manifold of probability distributions is given by the positive definite Riemannian metric [1–3,6,20,21,23–26]
$$\mathrm{d}s^2 = \sum_{i=1}^{M}\sum_{j=1}^{M} g_{ij}\,\mathrm{d}p_i\,\mathrm{d}p_j. \qquad (9)$$
It has been demonstrated [1,20,21] that the inner product is naturally defined by the covariance of two random variables. In particular, when the inner product of two vectors belonging to the natural basis of the tangent space is considered, the matrix $\{g_{ij}\}$ in (9) is an important quantity known as the Fisher information [1,20]. The term $g_{ii}$ is a measure of the amount of information due to the event $E_i$ [27,28]. It was Rao [25] who first proposed the Riemannian structure by using the Fisher information matrix. It is well known that the only invariant Riemannian metric is given by the Fisher information [1,20,21]. This is called the information metric.
When the information metric is considered, the statistical meaning of the distance (9) is elucidated by the Cramér-Rao theorem [1]. According to this theorem, the lowest variance in an estimation of $p_i$ when the remaining probabilities $p_j$ ($i \neq j$) are unknown is given by $g^{ii}$, $\{g^{ij}\}$ being the inverse of the Fisher information matrix. This result is formally stated by the following inequality:
$$V_{ii} \geq g^{ii}, \qquad (10)$$
where $V_{ii}$ is the variance (uncertainty) associated with $p_i$. The lower bound in this equation is a particular form of the Cramér-Rao inequality. Expression (10) relates uncertainty and distance, and it has been extended to quantum mechanics [23,29–31].
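As a simple illustration (taking, for this example only, a single observation of $x$ in the Bernoulli case $M = 2$), the Fisher information associated with $p_1$ is $1/[p_1(1-p_1)]$, so that its inverse is $g^{11} = p_1(1-p_1)$ and inequality (10) reads
$$V_{11} \geq p_1(1-p_1),$$
a bound attained by the relative-frequency estimator of $p_1$.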
The results exposed above show how information geometry provides a natural relation between uncertainty and distance. This relation suggests that the functions $s_i$ and the coefficients $g^{ii}$ should be related quantities, i.e.,
$$s_i \equiv s_i(g^{ii}) = s_i(p_i). \qquad (11)$$
The second equality in (11) is a consequence of property s.1, and it allows us to state the functional dependence $g^{ii} = g^{ii}(p_i)$.
From relation (11) and property s.2 we obtain
$$S = \sum_{i=1}^{M} s_i = \sum_{i=1}^{M} s_i(g^{ii}). \qquad (12)$$
This expression points out, as expected from the results provided by information geometry, how information is obtained from the metric structure defined on the probability space. With definition (11), uncertainty acquires a clear geometrical interpretation. This result is in good agreement with similar ideas proposed by other authors [2,32–35].
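For instance, if the metric is such that $g^{ii} = p_i$ (an assumption made only for this illustration) and the logarithmic functions $s_i(u) = -u \ln u$ are chosen, expression (12) yields $S = -\sum_{i=1}^{M} p_i \ln p_i$, so that the Shannon form of the uncertainty is recovered as a purely metric quantity.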
Given a metric structure $\{g_{ij}\}$, each possible distribution of functions $s = (s_1, \dots, s_M)$ generates a different quantity $S$ in (12). Thus we have the set of measures of uncertainty
$$U_g = \left\{ S_g(s) = \sum_{i=1}^{M} s_i(g^{ii}) \;\; \forall s \right\}. \qquad (13)$$
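As an illustration of how different choices of $s$ populate $U_g$, besides the logarithmic choice considered below one may take, for example, a power-law form such as $s_i(g^{ii}) = [g^{ii} - (g^{ii})^{q}]/(q-1)$ with $q > 0$, $q \neq 1$; this function is concave and defines a different element $S_g(s)$ of $U_g$ for each value of $q$.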
On the other hand, by considering the possible distributions $g = (g^{11}, \dots, g^{MM})$ that can be defined on a probability space, we obtain the set of sets
$$U = \left\{ U_g \;\; \forall g \,/\, g^{ii} = g^{ii}(p_i) \right\}. \qquad (14)$$
We will refer to $U$ as the set of “geometrical uncertainties”. The conceptual validity of each measure of uncertainty $S \in U_g$ depends on the behavior of the function $s_i$ with respect to properties s.1 to s.8.
Concavity is an expected property of the function $s_i$ [36]. Thus, given a metric tensor, we can take the concave function
$$s_i^{\ln}(g^{ii}) = -g^{ii} \ln g^{ii}. \qquad (15)$$
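Indeed, writing $u = g^{ii}$, the function $s^{\ln}(u) = -u \ln u$ satisfies $\mathrm{d}^2 s^{\ln}/\mathrm{d}u^2 = -1/u < 0$ for $u > 0$, so it is strictly concave on its domain.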
This function generates the following element in $U_g$:
$$S_g(s^{\ln}) = \sum_{i=1}^{M} s_i^{\ln}(g^{ii}) = -\sum_{i=1}^{M} g^{ii} \ln g^{ii}, \qquad (16)$$