MAE 3020 Assignment 9: Problems and Solutions

1. (6.1 in the textbook) Given the following conditional probability density functions of an observation Y:

$f_{Y|H_0}(y|H_0) = 1$, $0 \le y \le 1$, and $f_{Y|H_1}(y|H_1) = 2y$, $0 \le y \le 1$

(a) Derive the MAP decision rule assuming P(H0) = 1/2.
(b) Find the average probability of error.

S: (a) The likelihood ratio is

$L(y) = \dfrac{f_{Y|H_1}(y|H_1)}{f_{Y|H_0}(y|H_0)} = \dfrac{2y}{1} = 2y$

and the threshold is

$\dfrac{P(H_0)}{P(H_1)} = \dfrac{1/2}{1/2} = 1$

Therefore, the MAP decision rule is

$2y \underset{H_0}{\overset{H_1}{\gtrless}} 1$, or $y \underset{H_0}{\overset{H_1}{\gtrless}} \dfrac{1}{2}$

(b) Pe = P(H0)P(D1|H0) + P(H1)P(D0|H1)

$P(D_1|H_0) = P(Y > 1/2 \mid H_0) = \int_{1/2}^{1} 1\,dy = \dfrac{1}{2}$

$P(D_0|H_1) = P(Y < 1/2 \mid H_1) = \int_{0}^{1/2} 2y\,dy = \dfrac{1}{4}$

Hence, Pe = (1/2)(1/2) + (1/2)(1/4) = 3/8.

2. (6.2 in the textbook) Suppose that we want to decide whether or not a coin is fair by tossing it eight times and observing the number of heads showing up. Assume that we have to decide in favor of one of the following two hypotheses:

H0: fair coin, P(head) = 1/2
H1: unfair coin, P(head) = 0.4

(a) Derive the MAP decision rule assuming P(H0) = 1/2.
(b) Find the average probability of error.

S: (a) The number of heads X in eight tosses follows a binomial distribution:

$f(x|H_0) = b(8, 0.5) = \binom{8}{x} 0.5^x (1-0.5)^{8-x}$

$f(x|H_1) = b(8, 0.4) = \binom{8}{x} 0.4^x (1-0.4)^{8-x}$

Hence, the likelihood ratio is

$L(x) = \dfrac{f(x|H_1)}{f(x|H_0)} = \dfrac{\binom{8}{x} 0.4^x (1-0.4)^{8-x}}{\binom{8}{x} 0.5^x (1-0.5)^{8-x}} = \dfrac{0.4^x\, 0.6^{8-x}}{0.5^{8}}$

The threshold is

$\dfrac{P(H_0)}{P(H_1)} = \dfrac{1/2}{1/2} = 1$

Therefore, the MAP decision rule is

$0.4^x\, 0.6^{8-x} \underset{H_0}{\overset{H_1}{\gtrless}} (0.5)^8$

which selects H1 (unfair coin) when $x \le 3$ and H0 (fair coin) when $x \ge 4$.

(b) Pe = P(H0)P(D1|H0) + P(H1)P(D0|H1)

$P(D_1|H_0) = P(\text{3 or fewer heads} \mid H_0) = \binom{8}{3}(0.5)^3(0.5)^5 + \binom{8}{2}(0.5)^2(0.5)^6 + \binom{8}{1}(0.5)^1(0.5)^7 + \binom{8}{0}(0.5)^0(0.5)^8 = 0.3633$

$P(D_0|H_1) = P(\text{4 or more heads} \mid H_1) = 0.4059$

Hence, Pe = (1/2)(0.3633) + (1/2)(0.4059) = 0.3846.
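These figures can be checked numerically. The following is a minimal sketch (not part of the textbook solution; it assumes Python with scipy is available) that evaluates the likelihood ratio for every possible head count and recomputes P(D1|H0), P(D0|H1), and Pe:

```python
# Numerical check of Problem 2 (illustrative sketch, assuming scipy is installed).
from scipy.stats import binom

n, p0, p1 = 8, 0.5, 0.4
P_H0 = P_H1 = 0.5

# MAP rule: decide H1 whenever L(x) = b(x; 8, 0.4) / b(x; 8, 0.5) >= 1.
decide_H1 = [x for x in range(n + 1)
             if binom.pmf(x, n, p1) / binom.pmf(x, n, p0) >= 1.0]
print("Decide H1 (unfair coin) for x in:", decide_H1)      # x = 0, 1, 2, 3

# Error terms: P(D1|H0) = P(X <= 3 | p = 0.5), P(D0|H1) = P(X >= 4 | p = 0.4)
P_D1_H0 = binom.cdf(3, n, p0)        # ~0.3633
P_D0_H1 = binom.sf(3, n, p1)         # ~0.4059
Pe = P_H0 * P_D1_H0 + P_H1 * P_D0_H1
print(f"P(D1|H0) = {P_D1_H0:.4f}, P(D0|H1) = {P_D0_H1:.4f}, Pe = {Pe:.4f}")  # Pe ~ 0.3846
```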
3. (6.3 in the textbook) In a radar system, a decision about the presence (H1) or absence (H0) of a target is made on the basis of an observation Y that has the following conditional probability density functions:

$f(y|H_0) = \dfrac{y}{N_0}\exp\!\left(-\dfrac{y^2}{2N_0}\right)$, y > 0

$f(y|H_1) = \dfrac{y}{N_0}\exp\!\left(-\dfrac{(y-A)^2}{2N_0}\right)$, y > 0

Derive the MAP decision rule assuming P(H0) = 0.999.

S: The likelihood ratio is

$L(y) = \dfrac{f_{Y|H_1}(y|H_1)}{f_{Y|H_0}(y|H_0)} = \dfrac{\dfrac{y}{N_0}\exp\!\left(-\dfrac{(y-A)^2}{2N_0}\right)}{\dfrac{y}{N_0}\exp\!\left(-\dfrac{y^2}{2N_0}\right)}$

or

$L(y) = \exp\!\left(\dfrac{2Ay - A^2}{2N_0}\right)$

The threshold is

$\dfrac{P(H_0)}{P(H_1)} = \dfrac{0.999}{0.001} = 999$

Therefore, the decision rule is

$\exp\!\left(\dfrac{2Ay - A^2}{2N_0}\right) \underset{H_0}{\overset{H_1}{\gtrless}} 999$, or equivalently $y \underset{H_0}{\overset{H_1}{\gtrless}} \dfrac{N_0}{A}\ln 999 + \dfrac{A}{2}$

4. (6.4 in the textbook) The figure below shows the conditional pdfs of an observation based on which binary decisions are made. The cost function for this decision problem is:

C00 = 1, C01 = C10 = 3, C11 = 1

Assume that P(H0) = 1/3 and P(H1) = 2/3.

(a) Derive the decision rule that minimizes the average cost and find the minimum cost.
(b) Find the average cost associated with the decision rule

$y \underset{H_0}{\overset{H_1}{\gtrless}} 0$

and compare it with the average cost of the decision rule derived in Part (a).

[Figure: two triangular conditional pdfs, each with peak value 0.25. $f_{Y|H_0}(y|H_0)$ is supported on $-6 \le y \le 2$ with its peak at y = -2; $f_{Y|H_1}(y|H_1)$ is supported on $-2 \le y \le 6$ with its peak at y = 2.]

S: (a) From the figure, it is seen that if y < -2 the observation can only have come from H0, and if y > 2 it can only have come from H1. Therefore, we need only focus on the region -2 < y < 2, where

f(y|H0) = (2 - y)/16 and f(y|H1) = (2 + y)/16

The likelihood ratio is

$L(y) = \dfrac{f_{Y|H_1}(y|H_1)}{f_{Y|H_0}(y|H_0)} = \dfrac{2+y}{2-y}$

On the other hand, the threshold is

$\eta = \dfrac{P(H_0)(C_{10} - C_{00})}{P(H_1)(C_{01} - C_{11})} = \dfrac{(1/3)(2)}{(2/3)(2)} = \dfrac{1}{2}$

Therefore, the decision rule is

$\dfrac{2+y}{2-y} \underset{H_0}{\overset{H_1}{\gtrless}} \dfrac{1}{2}$

Since 2 - y ≥ 0 in this region, this is equivalent to $2(2+y) \underset{H_0}{\overset{H_1}{\gtrless}} 2-y$, so the decision rule simplifies to

$y \underset{H_0}{\overset{H_1}{\gtrless}} -\dfrac{2}{3}$

Next, let us calculate the minimum cost:

C = C00 P(D0|H0) P(H0) + C01 P(D1|H0) P(H0) + C10 P(D0|H1) P(H1) + C11 P(D1|H1) P(H1)

Since

$P(D_0|H_0) = \int_{-6}^{-2/3} f(y|H_0)\,dy = \int_{-6}^{-2} \dfrac{y+6}{16}\,dy + \int_{-2}^{-2/3} \dfrac{2-y}{16}\,dy = \dfrac{7}{9}$

P(D1|H0) = 2/9, P(D0|H1) = 1/18, P(D1|H1) = 17/18

Thus:

C = 1(1/3)(7/9) + 3(1/3)(2/9) + 3(2/3)(1/18) + 1(2/3)(17/18) = 1.222

(b) Under the decision rule

$y \underset{H_0}{\overset{H_1}{\gtrless}} 0$

P(D1|H0) = P(D0|H1) = (1/2)(2)(1/8) = 1/8 and P(D1|H1) = P(D0|H0) = 1 - 1/8 = 7/8

Hence:

C = 1(1/3)(7/8) + 3(1/3)(1/8) + 3(2/3)(1/8) + 1(2/3)(7/8) = 1.25

It is seen that this cost is higher than the minimum cost C = 1.222.

5. (6.5 in the textbook) Assume that under hypothesis H1 we observe a signal of amplitude 2 volts corrupted by additive noise N. Under hypothesis H0 we observe only the noise N. The noise is zero-mean Gaussian with a variance of 1/9. That is, we observe Y = 2 + N under H1 and Y = N under H0, with

$f_N(n) = \dfrac{3}{\sqrt{2\pi}}\exp\!\left(-\dfrac{9}{2}n^2\right)$

Assuming P(H0) = P(H1) = 1/2, derive the MAP decision rule and calculate the average probability of error.

S: H0: Y = N, N ~ N(0, 1/9)
H1: Y = 2 + N, N ~ N(0, 1/9)

The likelihood ratio is

$L(y) = \dfrac{f_{Y|H_1}(y|H_1)}{f_{Y|H_0}(y|H_0)} = \dfrac{\dfrac{3}{\sqrt{2\pi}}\exp\!\left(-\dfrac{9}{2}(y-2)^2\right)}{\dfrac{3}{\sqrt{2\pi}}\exp\!\left(-\dfrac{9}{2}y^2\right)}$

The threshold is

$\dfrac{P(H_0)}{P(H_1)} = \dfrac{1/2}{1/2} = 1$

Therefore, the MAP decision rule is

$\exp\!\left(\dfrac{9}{2}\left[y^2 - (y-2)^2\right]\right) \underset{H_0}{\overset{H_1}{\gtrless}} 1$

or:

$18y - 18 \underset{H_0}{\overset{H_1}{\gtrless}} \ln(1) = 0$, i.e. $y \underset{H_0}{\overset{H_1}{\gtrless}} 1$

The error probability is

Pe = P(D1|H0)P(H0) + P(D0|H1)P(H1) = P(D1|H0)

because P(D1|H0) = P(D0|H1) by symmetry. Since Y is Gaussian with variance 1/9 under H0,

$P_e = P\!\left(Z > \dfrac{1 - 0}{\sqrt{1/9}}\right) = P(Z > 3) = 0.0013$

6. (6.6 in the textbook) With reference to Problem 5, find the decision rule that minimizes the average cost with respect to the cost function:

C00 = 0, C01 = C10 = 2, C11 = 1

S: This is the same as Problem 5 except for the cost function. The resulting threshold is

$\eta = \dfrac{P(H_0)(C_{10} - C_{00})}{P(H_1)(C_{01} - C_{11})} = \dfrac{(1/2)(2 - 0)}{(1/2)(2 - 1)} = 2$

Therefore, the decision rule is

$18y - 18 \underset{H_0}{\overset{H_1}{\gtrless}} \ln(2)$

or:

$y \underset{H_0}{\overset{H_1}{\gtrless}} 1 + \dfrac{\ln 2}{18} = 1.0385$

7. (6.8 in the textbook) In a detection problem, $f_{Y|H_1} \sim N(-1, 1)$ and $f_{Y|H_0} \sim N(1, 1)$.

(a) Assuming P(H0) = 1/3 and P(H1) = 2/3, find the MAP decision rule.
(b) With C00 = C11 = 0, C01 = 6, C10 = 3, P(H0) = 1/3 and P(H1) = 2/3, find the decision rule that minimizes the cost and the value of the cost.

S:
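As a numerical illustration of the same likelihood-ratio procedure used in Problems 5 and 6, here is a minimal Python sketch (my own, not the textbook's worked solution; it assumes numpy and scipy are available) applied to the parameters of Problem 7. The printed thresholds and cost are a numerical check of the method rather than the official answer.

```python
# Sketch applying the likelihood-ratio procedure of Problems 5-6 to Problem 7.
import numpy as np
from scipy.stats import norm

# Hypotheses: Y|H0 ~ N(+1, 1), Y|H1 ~ N(-1, 1), priors P(H0) = 1/3, P(H1) = 2/3.
mu0, mu1, sigma = 1.0, -1.0, 1.0
P0, P1 = 1/3, 2/3

# For these unit-variance Gaussians, L(y) = f(y|H1)/f(y|H0) = exp(-2y),
# so L(y) > eta  <=>  y < ln(1/eta)/2, i.e. H1 is chosen for small y.

# (a) MAP rule: eta = P(H0)/P(H1).
eta_map = P0 / P1
y_map = np.log(1.0 / eta_map) / 2.0
print(f"MAP rule: decide H1 when y < {y_map:.4f}")

# (b) Minimum-cost (Bayes) rule: eta = P(H0)(C10 - C00) / [P(H1)(C01 - C11)].
C00, C01, C10, C11 = 0.0, 6.0, 3.0, 0.0
eta_cost = P0 * (C10 - C00) / (P1 * (C01 - C11))
y_cost = np.log(1.0 / eta_cost) / 2.0
print(f"Minimum-cost rule: decide H1 when y < {y_cost:.4f}")

# Average cost of that rule (C00 = C11 = 0, D1 is chosen when y < y_cost):
P_D1_H0 = norm.cdf(y_cost, loc=mu0, scale=sigma)   # false alarm probability
P_D0_H1 = norm.sf(y_cost, loc=mu1, scale=sigma)    # miss probability
cost = C01 * P_D1_H0 * P0 + C10 * P_D0_H1 * P1
print(f"Average cost of the minimum-cost rule: {cost:.4f}")
```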