ELEC444 Tut 1 solution

ELEC 444 – Digital Communications, Tutorial 1, February 2018
Problem 1
Suppose that equal numbers of letter grades A, B, C, D, F are given in a certain class. How much
information in bits have you received if the instructor tells you that your grade is not an F? How
much information do you still need to determine your grade?
Solution:
Information in bits you have received if the instructor has told you that your grade is not an F:

P(not F) = 1 − P(F) = 1 − 1/5 = 4/5, then

I(not F) = −log2(P(not F)) = −log2(4/5) ≈ 0.322 bits
To know your grade, 4 equally likely choices are left:

I(needed) = −log2(P(grade)) = −log2(1/4) = 2 bits
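As a quick numerical check, here is a minimal Python sketch of these two calculations; the function name self_information is just an illustrative choice, not something from the tutorial.

```python
import math

def self_information(p: float) -> float:
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

p_not_f = 1 - 1/5                     # P(not F) = 4/5
print(self_information(p_not_f))      # ~0.322 bits received
print(self_information(1/4))          # 2.0 bits still needed (4 equally likely remaining grades)
```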
Problem 2.
Consider a random variable X with alphabet X = {a1, a2, a3, a4} and probabilities PX(a1) = 1/2, PX(a2)
= 1/4, PX(a3) = 1/8, PX(a4) = 1/8. What is the entropy of this random variable? Suppose independent
trials of the random variable occur at rate r = 100 trials/second. What is then the rate of the source?
Can you devise a coder that exactly achieves the rate of the source?
Solution:
In general, the entropy H(X) of a random variable X is defined as
H(X) = − Σ_{x ∈ X} P_X(x) · log2(P_X(x))
So, now we can write the solution directly as
H(X) = (1/2)·log2(2) + (1/4)·log2(4) + (1/8)·log2(8) + (1/8)·log2(8)
     = 1/2 + 2/4 + 3/8 + 3/8
     = 4/8 + 4/8 + 3/8 + 3/8
     = 14/8
     = 1.75 bits/symbol
where we have used the facts that log_y(y^x) = x and log_y(1/x) = −log_y(x).
Then, the rate of the source is given by
R = r·H(X) = 100 · 1.75 = 175 bits/sec
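For readers who want to verify the arithmetic, a minimal Python sketch follows; the helper entropy and the variable names are illustrative choices, not part of the course material.

```python
import math

def entropy(probs) -> float:
    """Entropy H(X) in bits of a discrete distribution given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [1/2, 1/4, 1/8, 1/8]   # P_X(a1), ..., P_X(a4)
H = entropy(probs)             # 1.75 bits/symbol
r = 100                        # trials/second
print(H, r * H)                # 1.75 bits/symbol, 175.0 bits/sec
```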
The following encoding "rule" achieves the entropy limit (and is also easy to decode, since no
codeword is a prefix of any other codeword):
Outcome    Probability    Codeword
a1         1/2            0
a2         1/4            10
a3         1/8            110
a4         1/8            111
The average number of bits produced by the coder is now
(1/2)·1 + (1/4)·2 + (1/8)·3 + (1/8)·3 = 1.75 bits/symbol = H(X)
meaning that we indeed achieved the entropy limit. It is, however, not obvious when it is possible to
reach the entropy limit. This issue is revisited in Problem 3.
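As a sanity check on this code, here is a short Python sketch (the dictionary names codebook and probs are illustrative) that recomputes the average codeword length and confirms the prefix condition.

```python
codebook = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}
probs    = {"a1": 1/2, "a2": 1/4, "a3": 1/8, "a4": 1/8}

# Average codeword length: sum over symbols of P(x) * len(codeword(x))
avg_len = sum(probs[x] * len(codebook[x]) for x in codebook)
print(avg_len)   # 1.75 bits/symbol, equal to H(X)

# Prefix condition: no codeword is a prefix of another codeword
words = list(codebook.values())
prefix_free = all(not b.startswith(a) for a in words for b in words if a != b)
print(prefix_free)   # True
```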
Problem 3
Consider an unfair coin that produces heads with probability 1/4.
1. What is the entropy of the coin flip outcome?
2. Suppose the coin is flipped once per second. What is then the rate of this source?
3. Devise a coder to encode successive coin flip outcomes so that the average number of bits per
flip is less than one.
4. How does your encoder compare with the rate of the source?
Solution:
1. The entropy of the unfair coin flip outcome is given by
H(X) = −0.25·log2(0.25) − 0.75·log2(0.75) ≈ 0.81 bits.
2. When the coin is flipped once per second, the rate of the source is
R = r·H(X) ≈ 0.81 bits/sec.
3. If we encode each outcome separately (with bits 0 and 1), the average number of bits produced
by the coder is 0.25·1 + 0.75·1 = 1. We can do better with a coding scheme that always takes pairs
of outcomes and encodes them jointly. For example, the following encoder has a smaller average
number of bits per flip than the trivial coder. Here T = tail, H = head, P(T) = 3/4, P(H) = 1/4,
and consecutive flips are assumed independent.
Outcome    Probability    Codeword
TT         9/16           0
TH         3/16           10
HT         3/16           110
HH         1/16           111
So, the average number of bits produced by the coder is
(9/16)·1 + (3/16)·2 + (3/16)·3 + (1/16)·3 = 27/16 = 1.6875 bits/pair
meaning that the average number of bits per flip is then 1.6875 / 2 ≈ 0.8438 bits.
This is, of course, still above the entropy of the source, but already much closer to it than the
trivial coder's one bit per flip, as the sketch below confirms.
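A minimal Python sketch of this comparison, assuming independent flips with P(H) = 1/4 (the names p and pair_code are illustrative):

```python
import math
from itertools import product

p = {"H": 1/4, "T": 3/4}                              # unfair coin
H_flip = -sum(q * math.log2(q) for q in p.values())   # ~0.811 bits/flip

pair_code = {"TT": "0", "TH": "10", "HT": "110", "HH": "111"}

# Average code length per pair, weighting each pair by its probability under independence
avg_pair = sum(p[a] * p[b] * len(pair_code[a + b]) for a, b in product("TH", repeat=2))
print(H_flip)        # ~0.811 bits/flip (entropy of one flip)
print(avg_pair)      # 1.6875 bits/pair
print(avg_pair / 2)  # ~0.8438 bits/flip: below 1, but still above the entropy
```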
Problem 4
Suppose a source emits r = 2000 symbols/sec from an alphabet of size M = 4 with the symbol
probabilities and codewords listed in the following table:
xi    Pi     Codeword
A     1/2    0
B     1/4    1
C     1/8    10
D     1/8    11
1. Calculate the information rate.
2. Calculate the average code length of the source and the Kraft factor.
3. Draw a conclusion.
Solution:
The source information rate is R = r·H(X), where

H(X) = −(1/2)·log2(1/2) − (1/4)·log2(1/4) − (1/8)·log2(1/8) − (1/8)·log2(1/8) = 1.75 bits/symbol
R = 2000 × 1.75 = 3500 bits/sec.
The average code length (average number of binary digits per source symbol) is defined by:
N̄ = r_b / r = Σ_{i=1}^{M} P_i·N_i,
where rb is the signalling rate, r is the symbol rate, and Ni is the codeword length for the ith symbol.
N̄ = (1/2)·1 + (1/4)·1 + (1/8)·2 + (1/8)·2 = 1.25 < H(X).
• For a uniquely decipherable binary code, the codeword lengths Ni must satisfy the Kraft inequality:

  K = Σ_{i=1}^{M} 2^(−Ni) ≤ 1
Kraft factor:

K = 2^(−1) + 2^(−1) + 2^(−2) + 2^(−2) = 1.5 > 1.
• The result N̄ < H(X) is meaningless because K > 1, which tells us that this code is not
uniquely decipherable. For example, the code sequence 10011 could be decoded as BAABB, CABB, or CAD.
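To make the conclusion concrete, a small Python sketch follows; the brute-force helper decodings is an illustrative addition, not part of the tutorial.

```python
code = {"A": "0", "B": "1", "C": "10", "D": "11"}

# Kraft factor: sum of 2^(-Ni) over all codewords
K = sum(2 ** -len(w) for w in code.values())
print(K)   # 1.5 > 1, so the code cannot be uniquely decipherable

def decodings(bits: str):
    """Return every way this bit string can be parsed into codewords."""
    if bits == "":
        return [""]
    results = []
    for sym, w in code.items():
        if bits.startswith(w):
            results += [sym + rest for rest in decodings(bits[len(w):])]
    return results

print(decodings("10011"))   # several parses, including BAABB, CABB and CAD
```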