PROBABILITY OF ERROR – BLOCK ERROR RATE

Consider a received serial bit stream. Each bit has a probability of error p, i.e. the probability that a transmitted '0' is received as a '1' or a transmitted '1' is received as a '0'. This probability, sometimes called the single bit error rate or bit error rate (b.e.r.), is independent of whether any other bits are in error. For example, if p = 0.1, the probability that any single bit is in error is '1 in 10', or 0.1. If 5 consecutive bits were in error, the probability that the 6th bit will be in error is still 0.1, i.e. it is independent of the previous errors.

Tossing a coin provides another example. There is a '50/50' chance of heads or tails, i.e. the probability of heads is 0.5 and the probability of tails is 0.5, since there are only two equally likely outcomes. If 5 heads in a row occurred, the probability of a head on the 6th throw is still 0.5. Of course the probability of 5 heads (or tails) in a row is small, $(0.5)^5$, but given that 5 heads have occurred, the probability of 6 in a row, i.e. that the next throw is heads, is 0.5.

Thus it is assumed that the probability of any bit being in error is independent of the other bits and errors. In a real situation this may not always be the case, since errors often tend to occur in bursts (burst errors). Assuming additive white Gaussian noise (AWGN), however, errors may be assumed to be independent.

Within the bit stream there is information, usually in the form of message blocks. The first problem is to find where each message block starts, and then to recover the information from what appears to be a random sequence of 1's and 0's. A typical message block has the form:

[Figure: typical message block – synchronization bit pattern (SYNC) | address bits (ADD) | data/information bits | error control coding (EC) bits]

The first requirement for the receiver/decoder is to identify the synchronization pattern (SYNC) in the received bit stream; the address and data bits may then be extracted relatively easily. However, since the received bits are subject to errors, the SYNC pattern may not be found exactly. Error control coding (ECC), i.e. error detection and correction, cannot normally be applied to the SYNC bits, because synchronization is required before ECC can be performed. Once synchronization is achieved, the EC bits, which apply to the ADD (address) and DATA bits, need to be carefully chosen in order to achieve a specified performance.

To clarify the synchronization and ECC requirements, it is necessary to understand block error rates, e.g. the probability of no errors, 1 error, 2 errors, etc. For example, what is the probability of three errors in a 16 bit block if the b.e.r. is $p = 10^{-2}$? Or again, what is the probability of 3 heads, in any order, if a coin is tossed 6 times?

Block Error Rate

Let N be the number of bits in a block, and consider a block with N = 3.

Probability that a bit is in error = p (denote an errored bit by E).
Probability that a bit is not in error = 1 - p (denote a good bit by G).

For no errors, i.e. an error-free block, we require G G G, i.e. Good, Good and Good. Let R be the number of errors; in this case R = 0. Hence we may write

Probability of an error-free block = Probability that R = 0, i.e. P(R = 0) = P(0) = P(G and G and G).

Since the probability of a good bit is (1 - p) and the bit errors are independent,

$$P(0) = (1-p)(1-p)(1-p) = (1-p)^3$$

For 1 error in any position, the received block is E G G or G E G or G G E, so

$$P(1) = p(1-p)(1-p) + (1-p)\,p\,(1-p) + (1-p)(1-p)\,p = 3p(1-p)^2$$

Note that there are three ways of getting 1 error in 3 bits.

For 2 errors in any combination, the received block is E E G or E G E or G E E, so

$$P(2) = p\,p\,(1-p) + p\,(1-p)\,p + (1-p)\,p\,p = 3p^2(1-p)$$

Note that there are three ways of getting 2 errors in a 3 bit block.

For 3 errors the received block must be E E E, so

$$P(3) = p \cdot p \cdot p = p^3$$

Note that there is only 1 way of getting 3 errors in a 3 bit block.

In general, it may be shown that the probability of R errors in an N bit block subject to a bit error rate p is

$$P(R) = \,^{N}C_R \; p^R (1-p)^{N-R}$$

where

$$^{N}C_R = \binom{N}{R} = \frac{N!}{(N-R)!\,R!}$$

is the number of ways of getting R errors in N bits. In this expression, $p^R$ is the probability of R bits in error, $(1-p)^{N-R}$ is the probability of the remaining (N - R) good bits, and $^{N}C_R$ is the number of ways of getting R errors in N bits. Note that R is an integer, R = 0, 1, 2, 3, ..., N. P(R) is said to have a Binomial distribution.

To find P(R), simply substitute R in the equation. For example, for an N bit block:

$$P(0) = \,^{N}C_0 \; p^0 (1-p)^{N-0} = (1-p)^N$$
$$P(1) = \,^{N}C_1 \; p^1 (1-p)^{N-1}$$
$$P(2) = \,^{N}C_2 \; p^2 (1-p)^{N-2} \quad \text{etc.}$$
$$P(N) = \,^{N}C_N \; p^N (1-p)^{N-N} = p^N$$

Also note that

$$\sum_{R=0}^{N} P(R) = 1$$

i.e. the sum of the probabilities of all possible outcomes is 1, so that, for example,

$$\sum_{R=1}^{N} P(R) = 1 - P(0) \quad \text{etc.}$$
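These N = 3 results can be checked by brute force. The short Python sketch below (Python itself and the sample value p = 0.1 are illustrative choices, not part of the notes) enumerates all $2^3 = 8$ received patterns, accumulates the probability of each error count R, and compares the totals with the binomial formula $P(R) = \,^{N}C_R\,p^R(1-p)^{N-R}$.

```python
from itertools import product
from math import comb

N = 3    # block length, as in the worked derivation above
p = 0.1  # illustrative bit error rate (any 0 < p < 1 would do)

# P_enum[R] = total probability of all patterns with exactly R errors.
P_enum = [0.0] * (N + 1)
for pattern in product("EG", repeat=N):       # E E E, E E G, ..., G G G
    R = pattern.count("E")                    # number of bits in error
    P_enum[R] += p**R * (1 - p)**(N - R)      # independent bits multiply

# Compare the enumeration with P(R) = C(N,R) p^R (1-p)^(N-R).
for R in range(N + 1):
    P_binom = comb(N, R) * p**R * (1 - p)**(N - R)
    print(f"P({R}): enumeration = {P_enum[R]:.6f}, formula = {P_binom:.6f}")
```

For p = 0.1 both columns give P(0) = 0.729, P(1) = 0.243, P(2) = 0.027 and P(3) = 0.001, which sum to 1 as required.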
Example

An N = 8 bit block is received with a bit error rate p = 0.1. Determine the probability of an error-free block, the probability of a block with 1 error, and the probability of a block with 2 or more errors.

Probability of an error-free block:

$$P(0) = \,^{8}C_0 \; p^0 (1-p)^{8-0} = (1-0.1)^8 = (0.9)^8 = 0.4304672$$

Probability of 1 error:

$$P(1) = \,^{8}C_1 \; p^1 (1-p)^{8-1} = 8 \times 0.1 \times (0.9)^7 = 0.3826375$$

Probability of two or more errors = P(2) + P(3) + P(4) + ... + P(8), i.e. $\sum_{R=2}^{8} P(R)$. It would be tedious to work this out term by term, but since $\sum_{R=0}^{8} P(R) = 1$, we have $P(0) + P(1) + P(R \ge 2) = 1$, i.e.

$$P(R \ge 2) = 1 - (P(0) + P(1)) = 1 - (0.4304672 + 0.3826375) = 0.1868953$$

Example

A coin is tossed to give heads or tails. What is the probability of 5 heads in 5 throws?

Since the probability of a head is p = 0.5, the probability of a tail is (1 - p) = 0.5, and N = 5, the probability of 5 heads is

$$P(5) = \,^{5}C_5 \; p^5 (1-p)^{5-5} = (0.5)^5 = 3.125 \times 10^{-2}$$

Similarly, the probability of 3 heads in 5 throws (3 in any sequence) is

$$P(3) = \,^{5}C_3 \; p^3 (1-p)^{5-3} = 10 \times (0.5)^3 (0.5)^2 = 0.3125$$

Similar reasoning can be applied in many areas, for example to a 'production line'.
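The same arithmetic is easy to script. The sketch below (a minimal illustration; the helper name block_error_prob is an invented one) wraps the binomial formula in a small function, reproduces both worked examples, and also answers the question posed in the introduction about three errors in a 16 bit block with $p = 10^{-2}$.

```python
from math import comb

def block_error_prob(N: int, R: int, p: float) -> float:
    """P(R) = C(N,R) * p^R * (1-p)^(N-R): probability of exactly R
    errors in an N bit block with independent bit error rate p."""
    return comb(N, R) * p**R * (1 - p)**(N - R)

# Worked example: N = 8 bit block, b.e.r. p = 0.1.
N, p = 8, 0.1
P0 = block_error_prob(N, 0, p)               # error-free block
P1 = block_error_prob(N, 1, p)               # exactly 1 error
print(f"P(0)      = {P0:.7f}")               # 0.4304672
print(f"P(1)      = {P1:.7f}")               # 0.3826375
print(f"P(R >= 2) = {1 - (P0 + P1):.7f}")    # 0.1868953, via sum of P(R) = 1

# Coin-toss examples: p = 0.5, N = 5 throws.
print(f"P(5 heads in 5) = {block_error_prob(5, 5, 0.5):.5f}")  # 0.03125
print(f"P(3 heads in 5) = {block_error_prob(5, 3, 0.5):.4f}")  # 0.3125

# Question from the introduction: 3 errors in a 16 bit block, p = 1e-2.
print(f"P(3 in 16) = {block_error_prob(16, 3, 1e-2):.3e}")     # ~4.914e-04
```

With $p = 10^{-2}$ the probability of three errors in a 16 bit block therefore works out at only about $4.9 \times 10^{-4}$.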