Definition (Entropy).
The entropy of a discrete random variable $X$ is a quantity $H[X]$ that takes real values and has the following properties:
- (i) Normalisation: If $X$ is uniform on $\{0,1\}$, then $H[X]=1$.
- (ii) Invariance: If $X$ takes values in $A$, $Y$ takes values in $B$, $f$ is a bijection from $A$ to $B$, and for every $a\in A$ we have $\mathbb{P}[Y=f(a)]=\mathbb{P}[X=a]$, then $H[Y]=H[X]$.
- (iii) Extendability: If $X$ takes values in a set $A$, $B$ is disjoint from $A$, $Y$ takes values in $A\cup B$, and for all $a\in A$ we have $\mathbb{P}[Y=a]=\mathbb{P}[X=a]$, then $H[Y]=H[X]$.
- (iv) Maximality: If $X$ takes values in a finite set $A$ and $Y$ is uniformly distributed in $A$, then $H[X]\le H[Y]$.
- (v) Continuity: $H[X]$ depends continuously on $X$ with respect to total variation distance (where the distance between $X$ and $Y$ is defined to be $\max_E\,\lvert\mathbb{P}[X\in E]-\mathbb{P}[Y\in E]\rvert$).
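Before moving to the last axiom, it is worth noting that all five of the properties above are satisfied by the familiar Shannon formula $H[X]=-\sum_a\mathbb{P}[X=a]\log_2\mathbb{P}[X=a]$. The following is a minimal Python sketch, with helper names of my own choosing, that checks the first four numerically (continuity also holds for distributions on a fixed finite set, but is not checked here):

```python
import math

def shannon_entropy(dist):
    """Shannon entropy (base 2) of a distribution given as {value: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# (i) Normalisation: uniform on a two-element set has entropy 1.
fair_coin = {0: 0.5, 1: 0.5}
assert abs(shannon_entropy(fair_coin) - 1.0) < 1e-12

# (ii) Invariance: relabelling the values by a bijection leaves the entropy unchanged.
relabelled = {"heads": 0.5, "tails": 0.5}
assert shannon_entropy(relabelled) == shannon_entropy(fair_coin)

# (iii) Extendability: adding extra values of probability zero changes nothing
# (the convention 0 * log 0 = 0 is built into the filter above).
extended = {0: 0.5, 1: 0.5, 2: 0.0}
assert shannon_entropy(extended) == shannon_entropy(fair_coin)

# (iv) Maximality: any distribution on a 3-element set has entropy at most
# log2(3), the entropy of the uniform distribution on that set.
biased = {0: 0.9, 1: 0.05, 2: 0.05}
assert shannon_entropy(biased) <= math.log2(3)
```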
For the last axiom we need a definition. Let $X$ and $Y$ be random variables. The conditional entropy $H[X\mid Y]$ of $X$ given $Y$ is $\sum_y \mathbb{P}[Y=y]\,H[X\mid Y=y]$, where $H[X\mid Y=y]$ denotes the entropy of the distribution of $X$ conditioned on the event $Y=y$.
- (vi) Additivity: $H[X,Y]=H[Y]+H[X\mid Y]$.