CIS 5371 Cryptography
3. Probability & Information Theory

Basic rules of probability: Notation

Events: S, ∅, E, F, …, E∪F, E∩F, …

Pr[S] = 1, Pr[∅] = 0, 0 ≤ Pr[E] ≤ 1

Pr[E∪F] = Pr[E] + Pr[F] - Pr[E∩F]

Complement: Ē = S \ E, and Pr[E] + Pr[Ē] = 1

If E∩F = ∅, then Pr[E∪F] = Pr[E] + Pr[F]

Conditional probability: Pr[F | E] = Pr[E∩F] / Pr[E]
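
These rules are easy to sanity-check by enumeration. A minimal Python sketch, assuming a fair six-sided die as the sample space (an illustrative choice, not from the slides):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die (illustrative assumption).
S = frozenset({1, 2, 3, 4, 5, 6})
E = frozenset({2, 4, 6})   # event "the roll is even"
F = frozenset({4, 5, 6})   # event "the roll is at least 4"

def pr(event):
    """Pr[event] under equally likely outcomes: |event| / |S|."""
    return Fraction(len(event), len(S))

assert pr(S) == 1 and pr(frozenset()) == 0
assert pr(E | F) == pr(E) + pr(F) - pr(E & F)   # inclusion-exclusion
assert pr(S - E) == 1 - pr(E)                   # complement rule
assert pr(E & F) / pr(E) == Fraction(2, 3)      # Pr[F | E]
print("all probability rules check out")
```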

Basic rules of probability

An experiment can yield one of n equally probable outcomes.

Event E1: one particular value out of the n occurs.
Pr[E1] = 1/n

Event E2: one of m designated values out of the n occurs.
Pr[E2] = m/n

Event E3: an Ace is drawn from a pack of 52 cards.
Pr[E3] = 4/52 = 1/13
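
The Ace probability can be checked by counting favourable outcomes and, as a sanity check, by simulation (the Monte Carlo part is illustrative, not from the slides):

```python
import random
from fractions import Fraction

# Exact value by counting favourable outcomes: 4 Aces among 52 cards.
print(Fraction(4, 52))        # 1/13

# Monte Carlo sanity check (illustrative only).
deck = ["A"] * 4 + ["other"] * 48
trials = 100_000
hits = sum(random.choice(deck) == "A" for _ in range(trials))
print(hits / trials)          # close to 1/13 ≈ 0.0769
```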

Basic rules of probability

Binomial distribution:

Pr[k successes in n trials] = C(n,k) · p^k · (1 - p)^(n-k)

where C(n,k) = n! / (k!(n-k)!) is the binomial coefficient and p is the success probability of a single trial.

Bayes' Law:

Pr[E | F] = Pr[F | E] · Pr[E] / Pr[F]
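
Both formulas are one-liners in Python; a minimal sketch using math.comb, with made-up numbers for the Bayes step (not from the slides):

```python
from math import comb

def binomial_pmf(k, n, p):
    """Pr[k successes in n independent trials, success probability p]."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(3, 10, 0.5))    # exactly 3 heads in 10 fair flips ≈ 0.1172

# Bayes' Law with illustrative values (not from the slides):
pr_E, pr_F, pr_F_given_E = 0.01, 0.05, 0.9
print(pr_F_given_E * pr_E / pr_F)  # Pr[E | F] = 0.18
```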

Birthday Paradox

Let f : X → Y, where Y is a set of n elements.

Event E_{k,ε}: for k pairwise distinct values x1, x2, …, xk, the probability that a collision f(xi) = f(xj) occurs for some i ≠ j is at least ε.

Birthday Paradox: if ε = 1/2, then k ≈ 1.1774·√n
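
The constant 1.1774 is √(2 ln 2), obtained by setting ε = 1/2 in the standard collision-probability approximation. A small sketch (the example inputs are illustrative):

```python
from math import sqrt, log

def birthday_bound(n):
    """Approximate k giving a collision with probability 1/2: sqrt(2 ln 2) ≈ 1.1774."""
    return sqrt(2 * log(2) * n)

print(birthday_bound(365))     # ≈ 22.49: about 23 people for a shared birthday
print(birthday_bound(2**128))  # ≈ 1.1774 · 2**64: work to collide a 128-bit hash
```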

Information Theory

Entropy (Shannon)

The entropy of a message source is a measure of the amount of information the source has.

Let L = {a1, …, an} be a language with n letters.

S is a source that outputs these letters with independent probabilities Pr[a1], …, Pr[an].

The entropy of S is:

H(S) = Σ_{i=1}^{n} Pr[a_i] · log₂(1 / Pr[a_i]) bits
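
The formula transcribes directly into code; a minimal Python sketch (the example distributions are illustrative, not from the slides):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(S) in bits; zero-probability letters contribute 0."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair random bit
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely letters
print(entropy([0.9, 0.1]))   # ≈ 0.469 bits: a biased bit carries less information
```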

Information Theory

Example

A source S outputs a random bit. Its entropy is:

H(S) = Pr[0] · log₂(1/Pr[0]) + Pr[1] · log₂(1/Pr[1])
     = (1/2)·(1) + (1/2)·(1)
     = 1 bit
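
Running the same computation for a biased bit shows that a fair bit maximizes entropy; a short sketch with illustrative bias values (not from the slides):

```python
from math import log2

# H(p) = p·log2(1/p) + (1-p)·log2(1/(1-p)) for a bit with Pr[0] = p.
for p in (0.5, 0.75, 0.99):
    h = p * log2(1 / p) + (1 - p) * log2(1 / (1 - p))
    print(f"Pr[0] = {p}: H(S) = {h:.3f} bits")   # 1.000, 0.811, 0.081
```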