
Fixed Points and Cycles in Random Permutations
Eli Ross
March 1, 2013
18.304: Seminar in Discrete Math

Introduction
•  Permutation: a bijective mapping σ from a set of n elements to itself.
•  Fixed Point: any value k such that σ(k) = k.
•  Cycle: an ordered tuple of values (a_1, a_2, …, a_k) such that σ(a_1) = a_2, σ(a_2) = a_3, …, σ(a_k) = a_1.
Secret Santa

Outline
•  Permutations with k fixed points
•  Distribution/limiting distribution for the number of fixed points
•  Moments of the number of fixed points
•  Statistics about cycles
•  A glimpse of some applications

No Fixed Points?
A permutation with no fixed points is called a derangement. How many derangements of n elements are there?

$$ !n = n! \cdot \sum_{k=0}^{n} \frac{(-1)^k}{k!} $$

Why?
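As a sanity check (my own addition, not from the talk; helper names like `derangement_count` are mine), here is a minimal Python sketch comparing the formula with a brute-force count; the printed ratio !n / n! tends to 1/e:

```python
from itertools import permutations
from math import factorial

def derangement_count(n):
    # Inclusion-exclusion formula from the slide: !n = n! * sum_{k=0}^{n} (-1)^k / k!
    return round(factorial(n) * sum((-1) ** k / factorial(k) for k in range(n + 1)))

def brute_force_derangements(n):
    # Directly count permutations of {0, ..., n-1} with no fixed points.
    return sum(
        all(sigma[i] != i for i in range(n))
        for sigma in permutations(range(n))
    )

for n in range(1, 8):
    assert derangement_count(n) == brute_force_derangements(n)
    print(n, derangement_count(n), derangement_count(n) / factorial(n))  # ratio -> 1/e ~ 0.3679
```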
How Many Sad People?
•  Well, we just found the probability of no sad people. For large n, it's about 1/e.
•  For exactly k sad people, we need k fixed points and a derangement on the other (n − k) points:

$$ \binom{n}{k} \cdot \,!(n-k) = \frac{n! \cdot (n-k)!}{(n-k)! \, k!} \cdot \sum_{j=0}^{n-k} \frac{(-1)^j}{j!} = \frac{n!}{k!} \cdot \sum_{j=0}^{n-k} \frac{(-1)^j}{j!} $$
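A similar sketch (again mine, with hypothetical helper names) verifying that C(n, k) · !(n − k) matches a brute-force tally of fixed-point counts:

```python
from itertools import permutations
from math import comb, factorial

def exactly_k_fixed(n, k):
    # C(n, k) ways to choose the fixed points, times a derangement of the rest.
    m = n - k
    subfact = round(factorial(m) * sum((-1) ** j / factorial(j) for j in range(m + 1)))
    return comb(n, k) * subfact

n = 7
brute = [0] * (n + 1)
for sigma in permutations(range(n)):
    brute[sum(sigma[i] == i for i in range(n))] += 1

assert all(brute[k] == exactly_k_fixed(n, k) for k in range(n + 1))
print([exactly_k_fixed(n, k) for k in range(n + 1)])  # entries sum to 7! = 5040
```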
Limiting Distribution
We know that, for sufficiently large n,

$$ \sum_{j=0}^{n-k} \frac{(-1)^j}{j!} \to e^{-1}. $$

Hence,

$$ \frac{n!}{k!} \cdot \sum_{j=0}^{n-k} \frac{(-1)^j}{j!} \to \frac{n!}{e \cdot k!}. $$
Poisson!
In other words, the probability that a random permutation of n elements contains exactly k fixed points approaches

$$ \frac{1}{n!} \cdot \frac{n!}{e \cdot k!} = \frac{1}{e \cdot k!}. $$

Does this look familiar? How about $\dfrac{1^k \cdot e^{-1}}{k!}$?
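A minimal Monte Carlo sketch (my own; the parameter choices n = 50 and 100,000 trials are illustrative) comparing the empirical fixed-point distribution against 1/(e·k!):

```python
import random
from collections import Counter
from math import e, factorial

def fixed_point_histogram(n, trials):
    counts = Counter()
    for _ in range(trials):
        sigma = list(range(n))
        random.shuffle(sigma)  # uniform random permutation
        counts[sum(sigma[i] == i for i in range(n))] += 1
    return counts

n, trials = 50, 100_000
counts = fixed_point_histogram(n, trials)
for k in range(6):
    print(f"k={k}: empirical {counts[k] / trials:.4f}  vs  1/(e*k!) = {1 / (e * factorial(k)):.4f}")
```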
Order Statistics
The expected value for the number of fixed points is simply:

$$ \frac{1}{n!} \sum_{k=0}^{n} \left( k \cdot \frac{n!}{k!} \cdot \sum_{j=0}^{n-k} \frac{(-1)^j}{j!} \right) $$

which isn't particularly appealing at first glance (or second glance, or third).

Linearity of Expectation
The probability that σ(k) = k is simply 1/n, so the expected value for the number of fixed points is:

$$ \underbrace{\frac{1}{n} + \cdots + \frac{1}{n}}_{n \text{ times}} = 1. $$
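A tiny exact check of this (mine, not from the slides): averaging the fixed-point count over all n! permutations gives exactly 1 for every n:

```python
from itertools import permutations
from math import factorial

# Exact average number of fixed points over ALL permutations of n elements.
for n in range(1, 8):
    total = sum(sum(s[i] == i for i in range(n)) for s in permutations(range(n)))
    print(n, total / factorial(n))  # exactly 1.0 for every n
```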
Variance
We can compute:

$$ E[X^2] = E\big[(X_1 + \cdots + X_n)^2\big] = \sum_{j=1}^{n} E[X_j^2] + 2 \sum_{1 \le i < j \le n} E[X_i X_j] = n \cdot E[X_1^2] + (n^2 - n) \cdot E[X_1 X_2] $$

Since $E[X_1^2] = 1/n$ and $E[X_1 X_2] = \frac{1}{n(n-1)}$, this is $1 + 1 = 2$, so the variance is $E[X^2] - E[X]^2 = 2 - 1 = 1$.
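A short exact check (my own sketch) of E[X²] = 2, and hence variance 1, for small n:

```python
from itertools import permutations
from math import factorial

# Exact E[X^2] and variance of the fixed-point count for small n.
for n in range(2, 8):
    second = sum(sum(s[i] == i for i in range(n)) ** 2 for s in permutations(range(n)))
    m2 = second / factorial(n)
    print(n, m2, m2 - 1)  # E[X^2] = 2 and Var(X) = 1 for every n >= 2
```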
What About Higher Order Statistics?
•  Now we know that the variance for the number of fixed points is 1.
•  What about higher order statistics, which could be used to help compute skewness/kurtosis/etc.?
•  Let's consider, for k < n,

$$ E[X^k] = E\big[(X_1 + \cdots + X_n)^k\big] $$
kth Moment

$$ E[X^k] = B_k \quad \text{where } k \le n, $$

where the B_k are the Bell Numbers: 1, 1, 2, 5, 15, 52, 203, 877, … They grow very quickly.
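A sketch (mine; `bell_numbers` is a hypothetical helper built on the Bell triangle) checking that the exact moments E[X^k] for n = 6 match B_k:

```python
from itertools import permutations
from math import factorial

def bell_numbers(count):
    # Build the Bell triangle; the first entry of each row is a Bell number.
    bells, row = [1], [1]
    while len(bells) < count:
        new_row = [row[-1]]
        for x in row:
            new_row.append(new_row[-1] + x)
        row = new_row
        bells.append(row[0])
    return bells  # [B_0, B_1, B_2, ...] = [1, 1, 2, 5, 15, 52, ...]

n = 6
moments = []
for k in range(1, n + 1):
    total = sum(sum(s[i] == i for i in range(n)) ** k for s in permutations(range(n)))
    moments.append(total / factorial(n))

print(moments)                   # [1.0, 2.0, 5.0, 15.0, 52.0, 203.0]
print(bell_numbers(n + 1)[1:])   # [B_1, ..., B_6] = [1, 2, 5, 15, 52, 203]
```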
Cycles!
•  We might also be curious to investigate what sorts of cycles appear in such permutations.
•  How many times do we expect to see two people simply exchanging gifts?
•  How likely are we to have a cycle of size k?
•  How many cycles do we expect to have?
•  And more…

Expected Number of k-cycles
Let C be a k-cycle. Then:

$$ \sum_{C} E[\mathbf{1}_C] = \frac{n(n-1)\cdots(n-k+1)}{k} \cdot \frac{(n-k)!}{n!} = \frac{1}{k}. $$
Poisson Again!
It's a bit trickier to show, but for large n, the number of k-cycles approaches a Poisson distribution with parameter 1/k.
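A Monte Carlo sketch (my own; the parameter choices n = 100, k = 3 are illustrative) showing both the mean 1/k and the Poisson(1/k) shape:

```python
import random
from collections import Counter
from math import exp, factorial

def count_k_cycles(sigma, k):
    # Walk each cycle of sigma once; count those of length exactly k.
    seen, count = set(), 0
    for start in range(len(sigma)):
        if start in seen:
            continue
        length, j = 0, start
        while j not in seen:
            seen.add(j)
            j = sigma[j]
            length += 1
        count += (length == k)
    return count

n, k, trials = 100, 3, 50_000
hist = Counter()
for _ in range(trials):
    sigma = list(range(n))
    random.shuffle(sigma)
    hist[count_k_cycles(sigma, k)] += 1

mean = sum(c * v for c, v in hist.items()) / trials
print(f"empirical mean {mean:.4f}  vs  1/k = {1 / k:.4f}")
for c in range(4):
    pmf = exp(-1 / k) * (1 / k) ** c / factorial(c)   # Poisson(1/k) pmf
    print(f"P({c} three-cycles): {hist[c] / trials:.4f}  vs  {pmf:.4f}")
```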
Expected Number of Cycles
Since we found that the expected number of k-cycles was 1/k, it follows simply that the expected number of cycles is:

$$ \sum_{k=1}^{n} \frac{1}{k} = H_n \approx \log n. $$
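A quick simulation (mine, with illustrative parameters) comparing the average cycle count to H_n and log n:

```python
import random
from math import log

def num_cycles(sigma):
    # Count the cycles of sigma by walking each one exactly once.
    seen, count = set(), 0
    for start in range(len(sigma)):
        if start not in seen:
            count += 1
            j = start
            while j not in seen:
                seen.add(j)
                j = sigma[j]
    return count

n, trials = 1000, 2000
total = 0
for _ in range(trials):
    sigma = list(range(n))
    random.shuffle(sigma)
    total += num_cycles(sigma)

harmonic = sum(1 / k for k in range(1, n + 1))
print(total / trials, harmonic, log(n))  # empirical mean ~ H_n ~ log n + 0.577...
```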
Probability that YOU'RE in a k-cycle
•  There are $\binom{n-1}{k-1} \cdot (k-1)!$ cycles of length k that contain you.
•  There are $(n-k)!$ ways to arrange the rest of the permutation.
•  Hence, the probability is:

$$ \frac{1}{n!} \cdot \binom{n-1}{k-1} \cdot (k-1)! \cdot (n-k)! = \frac{1}{n}. $$
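A simulation sketch (my own helper names) showing that the length of the cycle containing a fixed element is close to uniform on {1, …, n}, so each k has probability about 1/n:

```python
import random
from collections import Counter

def my_cycle_length(sigma, me=0):
    # Follow sigma from `me` until we return to `me`.
    length, j = 1, sigma[me]
    while j != me:
        j = sigma[j]
        length += 1
    return length

n, trials = 30, 100_000
hist = Counter()
for _ in range(trials):
    sigma = list(range(n))
    random.shuffle(sigma)
    hist[my_cycle_length(sigma)] += 1

for k in (1, 5, 15, 30):
    print(f"P(you are in a {k}-cycle) ~ {hist[k] / trials:.4f}  vs  1/n = {1 / n:.4f}")
```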
What's the expected cycle size?

Is your friend in the same cycle as you?
•  It's a coin flip; there's a ½ chance that any two distinct elements will be in the same cycle. I'll leave this as an exercise, but it's similar to what we've been doing.
•  Furthermore, the probability that any m elements all are in the same cycle is 1/m.

Other Applications
While I won't go into too much detail…

Analogous Problems
•  Secret Santa
•  Mathematician's Hats
•  Room swapping

Quickselect
Quickselect is a selection algorithm closely related to quicksort. Like quicksort, it is good in practice. It involves choosing a random element, splitting the data around it, and recursing into the part that contains the target (see the sketch after the final remarks). How good? It depends on how "mixed-up" the data is. This has to do with cycles/fixed points.

Turbo Codes
•  These are really good coding schemes which transmit over noisy channels and get close to the (theoretical) channel capacity.
•  Uses random permutations; might want certain properties to do better – possibly pseudorandom or even nonrandom.

Final Remarks
•  There are nicer proofs of many things in this talk using exponential generating functions.
•  There are cooler implications of these results with regard to encoding structures. I might talk about this for my next talk or for my paper.
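As promised in the quickselect slide above, a minimal quickselect sketch (my own illustrative implementation, not code from the talk):

```python
import random

def quickselect(data, k):
    # Return the k-th smallest element (0-indexed). Choose a random pivot,
    # split around it, and recurse only into the part containing the answer.
    pivot = random.choice(data)
    less = [x for x in data if x < pivot]
    equal = [x for x in data if x == pivot]
    greater = [x for x in data if x > pivot]
    if k < len(less):
        return quickselect(less, k)
    if k < len(less) + len(equal):
        return pivot
    return quickselect(greater, k - len(less) - len(equal))

values = random.sample(range(10_000), 500)
assert quickselect(values, 42) == sorted(values)[42]
```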