Mathematical Measure of Information

Self-Information
Consider a discrete random variable $X$ with possible values $x_i$, $i = 1, 2, \dots, n$. The self-information of the event $X = x_i$ is:

$$I(x_i) = \log \frac{1}{P(x_i)} = -\log P(x_i)$$

We note that a high-probability event conveys less information than a low-probability event. For an event with $P(x_i) = 1$, $I(x_i) = 0$. Since a low probability implies a higher degree of uncertainty (and vice versa), a random variable with a high degree of uncertainty contains more information. Since $0 \le P(x_i) \le 1$, $I(x_i) \ge 0$, i.e., self-information is non-negative.
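As a quick numerical sketch of this definition (the `self_information` helper and the sample probabilities are my own illustrative choices, not from the notes), self-information in bits follows directly from $-\log_2 P(x)$:

```python
import math

def self_information(p: float) -> float:
    """Self-information I(x) = -log2 P(x), in bits."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log2(p)

print(self_information(1.0))   # 0.0 -- a certain event carries no information
print(self_information(0.5))   # 1.0 -- one bit for a probability-1/2 event
print(self_information(0.25))  # 2.0 -- rarer events carry more bits
```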
Example:
Consider a binary source which tosses a fair coin and outputs a 1 if a head (H) appears and a 0 if a tail (T) appears. For this source, $P(1) = P(0) = 1/2$. The information content of each output from the source is:

$$I(x_i) = -\log_2 \tfrac{1}{2} = 1 \text{ bit}$$

Indeed, we have to use only one bit to represent the output from this binary source.
Now, suppose the successive outputs from the binary source are statistically independent, i.e., the source is memoryless. Consider a block of $n$ bits. There are $2^n$ possible $n$-bit blocks, each of which is equally probable with probability $2^{-n}$. The self-information of an $n$-bit block is:

$$I = -\log_2 2^{-n} = n \text{ bits}$$
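A one-line numerical check of the block result (the value $n = 8$ is an arbitrary illustrative choice):

```python
import math

n = 8
p_block = 2.0 ** -n          # each of the 2^n equally likely n-bit blocks
print(-math.log2(p_block))   # 8.0 -- an n-bit block carries exactly n bits
```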
Mutual Information
Suppose we have some outcome $Y = y_j$ and we want to determine the amount of information this event provides about the event $X = x_i$, i.e., we want to mathematically represent the mutual information. The mutual information $I(x_i; y_j)$ between $x_i$ and $y_j$ is defined as:

$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)}$$
Prove: $I(x_i; y_j) = I(y_j; x_i)$. By Bayes' rule, $P(x_i \mid y_j) P(y_j) = P(x_i, y_j) = P(y_j \mid x_i) P(x_i)$, so

$$I(x_i; y_j) = \log \frac{P(x_i, y_j)}{P(x_i) P(y_j)} = \log \frac{P(y_j \mid x_i)}{P(y_j)} = I(y_j; x_i)$$

The physical interpretation of $I(x_i; y_j) = I(y_j; x_i)$ is as follows. The information provided by the occurrence of the event $Y = y_j$ about the event $X = x_i$ is identical to the information provided by the occurrence of the event $X = x_i$ about the event $Y = y_j$.
We note two extreme cases:
i. When the random variables $X$ and $Y$ are statistically independent, $P(x_i \mid y_j) = P(x_i)$, which leads to $I(x_i; y_j) = 0$.
ii. When the occurrence of $Y = y_j$ uniquely determines the occurrence of the event $X = x_i$, $P(x_i \mid y_j) = 1$ and the mutual information becomes

$$I(x_i; y_j) = \log \frac{1}{P(x_i)} = I(x_i)$$

This is the self-information of the event $X = x_i$.
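Both extreme cases can be reproduced with a small sketch of the pairwise definition (the function name and the probabilities are my own illustrative choices):

```python
import math

def mutual_information(p_x_given_y: float, p_x: float) -> float:
    """Pairwise mutual information I(x; y) = log2 [P(x|y) / P(x)], in bits."""
    return math.log2(p_x_given_y / p_x)

# Case i: independence, P(x|y) = P(x), so I(x; y) = 0
print(mutual_information(0.5, 0.5))   # 0.0

# Case ii: y uniquely determines x, P(x|y) = 1, so I(x; y) = I(x) = -log2 P(x)
print(mutual_information(1.0, 0.25))  # 2.0
```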
Example:
Consider a Binary Symmetric Channel (BSC) as shown in the figure. It is a channel that transports 1's and 0's from the transmitter (Tx) to the receiver (Rx). It occasionally makes an error, with probability $p$: a BSC flips 1 to 0 and vice versa with equal probability. Let $X$ and $Y$ be binary random variables that represent the input and output of this BSC, respectively. Let the input symbols be equally likely, $P(X=0) = P(X=1) = 1/2$, and let the output symbols depend upon the input according to the channel transition probabilities:

$$P(Y=0 \mid X=0) = P(Y=1 \mid X=1) = 1 - p$$
$$P(Y=1 \mid X=0) = P(Y=0 \mid X=1) = p$$
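The worked numbers for this example did not survive extraction, so here is a hedged sketch of the computation under the standard BSC model described above ($p = 0.1$ is an arbitrary illustrative value, and all names are my own):

```python
import math

p = 0.1                      # crossover (error) probability, illustrative value
p_x = {0: 0.5, 1: 0.5}       # equally likely input symbols
p_y_given_x = {(0, 0): 1 - p, (1, 0): p,
               (0, 1): p,     (1, 1): 1 - p}   # (y, x) -> P(Y=y | X=x)

# P(y) = sum_x P(y|x) P(x); by symmetry both outputs come out to 0.5 here
p_y = {y: sum(p_y_given_x[(y, x)] * p_x[x] for x in (0, 1)) for y in (0, 1)}

# Pairwise mutual information I(x; y) = log2 [ P(y|x) / P(y) ]
for x in (0, 1):
    for y in (0, 1):
        i_xy = math.log2(p_y_given_x[(y, x)] / p_y[y])
        print(f"I(x={x}; y={y}) = {i_xy:+.4f} bits")
```

Note that the matching input-output pairs give positive mutual information, while the mismatched pairs give negative values, which anticipates the sign discussion in the next section.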
Conditional Self-Information
The conditional self-information of the event $X = x_i$ given $Y = y_j$ is defined as:

$$I(x_i \mid y_j) = \log \frac{1}{P(x_i \mid y_j)} = -\log P(x_i \mid y_j)$$

Thus we may write:

$$I(x_i; y_j) = I(x_i) - I(x_i \mid y_j)$$
The conditional self-information can be interpreted as the self-information about the event $X = x_i$ on the basis of the event $Y = y_j$. Recall that both $I(x_i) \ge 0$ and $I(x_i \mid y_j) \ge 0$. Therefore, $I(x_i; y_j) > 0$ when $I(x_i) > I(x_i \mid y_j)$, $I(x_i; y_j) < 0$ when $I(x_i) < I(x_i \mid y_j)$, and $I(x_i; y_j) = 0$ when $I(x_i) = I(x_i \mid y_j)$. Hence, mutual information can be positive, negative or zero.
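The decomposition $I(x_i; y_j) = I(x_i) - I(x_i \mid y_j)$ can be checked against the hypothetical BSC numbers used earlier (all names and values remain my own illustrative choices):

```python
import math

p = 0.1                                  # BSC crossover probability (illustrative)
p_x0 = 0.5                               # P(X = 0), inputs equally likely
p_y0 = 0.5                               # P(Y = 0), by symmetry of the BSC
p_x0_given_y0 = (1 - p) * p_x0 / p_y0    # Bayes' rule: P(x|y) = P(y|x) P(x) / P(y)

i_x = -math.log2(p_x0)                   # I(x)   = 1 bit
i_x_given_y = -math.log2(p_x0_given_y0)  # I(x|y) ~ 0.152 bits
print(i_x - i_x_given_y)                 # ~0.848 bits = I(x=0; y=0), positive here
```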
Average Mutual Information and Entropy
So far we have studied the mutual information associated with a pair of events $x_i$ and $y_j$, which are the possible outcomes of the two random variables $X$ and $Y$. We now want to find out the average mutual information between the two random variables. This can be obtained simply by weighting $I(x_i; y_j)$ by the probability of occurrence of the joint event and summing over all possible joint events:

$$I(X; Y) = \sum_i \sum_j P(x_i, y_j)\, I(x_i; y_j) = \sum_i \sum_j P(x_i, y_j) \log \frac{P(x_i, y_j)}{P(x_i) P(y_j)}$$

For the case when $X$ and $Y$ are statistically independent, $P(x_i, y_j) = P(x_i) P(y_j)$ and $I(X; Y) = 0$, i.e., there is no average mutual information between $X$ and $Y$. An important property of the average mutual information is that

$$I(X; Y) \ge 0,$$

where equality holds if and only if $X$ and $Y$ are statistically independent.
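A compact sketch of the double sum, applied to the hypothetical BSC joint distribution from the earlier example (names and the value $p = 0.1$ are my own):

```python
import math

p = 0.1                                       # BSC crossover probability
p_x = {0: 0.5, 1: 0.5}                        # input distribution
p_y_given_x = {(0, 0): 1 - p, (1, 0): p,
               (0, 1): p,     (1, 1): 1 - p}  # (y, x) -> P(Y=y | X=x)
p_y = {y: sum(p_y_given_x[(y, x)] * p_x[x] for x in (0, 1)) for y in (0, 1)}
p_xy = {(x, y): p_y_given_x[(y, x)] * p_x[x] for x in (0, 1) for y in (0, 1)}

# I(X;Y) = sum_ij P(x,y) log2 [ P(x,y) / (P(x) P(y)) ]
i_xy = sum(pr * math.log2(pr / (p_x[x] * p_y[y]))
           for (x, y), pr in p_xy.items() if pr > 0)
print(i_xy)   # ~0.531 bits for p = 0.1 (equals 1 - H(p) for a uniform input)
```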
Average self-information of the random variable $X$ is defined as:

$$H(X) = \sum_i P(x_i)\, I(x_i) = -\sum_i P(x_i) \log P(x_i)$$

When $X$ represents the alphabet of possible output letters from a source, $H(X)$ represents the average information per source letter. In this case $H(X)$ is called the Entropy, which is a measure of the uncertainty of a random variable.
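Finally, a minimal entropy sketch (the probability vectors are illustrative choices of my own):

```python
import math

def entropy(probs) -> float:
    """H(X) = -sum_i P(x_i) log2 P(x_i), in bits per source letter."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0  -- fair binary source
print(entropy([0.25] * 4))   # 2.0  -- four equally likely letters
print(entropy([0.9, 0.1]))   # ~0.469 -- the binary entropy H(0.1)
```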