Review from tutorial 1

TUTORIAL 1
Asymptotic notation (10 minutes)
The notations we use to describe the asymptotic running time of an algorithm.
:
 (g(n)) is the set of functions f(n) such that there exist positive constants c1, c2 and n0 such that
0c1*g(n)  f(n)  c2*g(n) for all nn0
 notation bounds a function within constant factors
c2g(n)
f(n)
c1g(n)
n0
Example: 3n2 + 2n + 10 = (n2)
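One concrete choice of constants that witnesses this (a sketch; many other choices work):

```latex
% Take c_1 = 3, c_2 = 15, n_0 = 1. Then for all n >= 1:
3n^2 \;\le\; 3n^2 + 2n + 10 \;\le\; 3n^2 + 2n^2 + 10n^2 \;=\; 15n^2 ,
\qquad \text{so } 3n^2 + 2n + 10 = \Theta(n^2).
```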
O:
O(g(n)) is the set of functions f(n) such that there exist positive constants c and n0 such that
0 ≤ f(n) ≤ c*g(n) for all n ≥ n0.
O gives an upper bound for a function to within a constant factor. In the figure above, c = c2.
Example: 2n = O(n^2)
Ω:
Ω(g(n)) is the set of functions f(n) such that there exist positive constants c and n0 such that
0 ≤ c*g(n) ≤ f(n) for all n ≥ n0.
Ω gives a lower bound for a function to within a constant factor. In the figure above, c = c1.
Example: 3n^2 = Ω(n)
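Possible witnessing constants for the two examples above (again, one choice among many):

```latex
% 2n = O(n^2): take c = 1 and n_0 = 2.
2n \le 1 \cdot n^2 \quad \text{for all } n \ge 2 .
% 3n^2 = \Omega(n): take c = 3 and n_0 = 1.
3 \cdot n \le 3n^2 \quad \text{for all } n \ge 1 .
```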
Basic Probability (20 minutes)
1. Sample Space S is the set of all possible outcomes of an experiment.
Example: tossing two coins has S={HH,TH,HT,TT}
[Figure: the sample space shown as its four elementary events HH, TH, HT, TT]
Usually we are interested in collections of elementary events, which are called events
Example event: "at least one head is observed", i.e., the set {HH, TH, HT}.
[Figure: the event {HH, TH, HT} as a region of the sample space, with TT outside it]
2. Probability is a measure of how likely an uncertain event is to occur.
The probability of an event A is Pr{A} = (# of elementary events in A)/(# of elementary events in S),
provided the elementary events are equally likely.
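A minimal Python sketch (not part of the original notes) that enumerates the two-coin sample space and computes the probability of "at least one head" by counting equally likely elementary events:

```python
from itertools import product

# Sample space of two coin tosses: S = {HH, HT, TH, TT}
S = ["".join(toss) for toss in product("HT", repeat=2)]

# Event A: at least one head is observed
A = [s for s in S if "H" in s]

# With equally likely elementary events, Pr{A} = |A| / |S|
print(S)                # ['HH', 'HT', 'TH', 'TT']
print(len(A) / len(S))  # 0.75
```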
Probability axioms that the assignment of probabilities must satisfy:
1. 0 ≤ Pr{A} ≤ 1, for every event A.
2. Pr{S} = 1: some outcome in the sample space is certain to occur.
3. If A1, …, An are pairwise mutually exclusive events (Ai ∩ Aj = ∅ for i ≠ j),
then Pr{A1 ∪ … ∪ An} = Pr{A1} + … + Pr{An}.
Rules of probabilities
1. Pr{A ∪ B} = Pr{A} + Pr{B} − Pr{A ∩ B}
2. Pr{A’}=1-Pr{A}. By probability axioms 2,3
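As a worked check of both rules on the two-coin sample space (not in the original notes), let A = first toss is heads and B = second toss is heads, so that A ∪ B is "at least one head":

```latex
\Pr\{A \cup B\} = \Pr\{A\} + \Pr\{B\} - \Pr\{A \cap B\}
               = \tfrac{1}{2} + \tfrac{1}{2} - \tfrac{1}{4} = \tfrac{3}{4},
\qquad
\Pr\{(A \cup B)'\} = 1 - \tfrac{3}{4} = \tfrac{1}{4} = \Pr\{TT\}.
```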
Conditional probabilities
Conditional probability of A given B
Pr{A|B} = Pr{A ∩ B}/Pr{B}
[Figure: a sample space of 9 points; A contains 4 of them, B contains 6, and A ∩ B contains 2]
|S| = 9, |A| = 4, |B| = 6, |A ∩ B| = 2
Pr{A|B} = (2/9)/(6/9) = 2/6
The new sample space is defined by B, so intuitively only 2 of the 6 events in B satisfy A.
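A small Python check of the conditional-probability formula on a 9-point sample space like the one above (the particular labels and sets below are hypothetical, chosen only to match |A| = 4, |B| = 6, |A ∩ B| = 2):

```python
# Nine equally likely elementary events, labelled 1..9.
S = set(range(1, 10))
A = {1, 2, 3, 4}            # |A| = 4
B = {3, 4, 5, 6, 7, 8}      # |B| = 6, |A ∩ B| = 2

pr = lambda E: len(E) / len(S)

print(pr(A & B) / pr(B))    # Pr{A|B} = (2/9)/(6/9) = 0.333...
print(len(A & B) / len(B))  # same number: 2 of the 6 events in B satisfy A
```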
Independent events
Events A and B are independent if Pr{A ∩ B} = Pr{A}*Pr{B}.
Equivalently, if Pr{B} ≠ 0, they are independent if Pr{A|B} = Pr{A}.
Example: 2 coin tosses, with A = first toss is heads and B = second toss is tails.
[Figure: Venn diagram of A and B; the intersection A ∩ B is the single outcome HT]
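Checking the definition on this example (a one-line verification, not in the original notes):

```latex
\Pr\{A \cap B\} = \Pr\{HT\} = \tfrac{1}{4}
               = \tfrac{1}{2} \cdot \tfrac{1}{2}
               = \Pr\{A\} \cdot \Pr\{B\},
\qquad \text{so } A \text{ and } B \text{ are independent.}
```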
Events that are NOT independent:
[Figure: A and B disjoint] Pr{A ∩ B} = 0, which differs from Pr{A}*Pr{B} when both probabilities are positive.
[Figure: A contained in B] Pr{A ∩ B} = Pr{A}, which differs from Pr{A}*Pr{B} when Pr{A} > 0 and Pr{B} < 1.
3. Random Variable is a variable whose value is determined in some way by chance.
Example: Number of people in a line at a bank at some point.
Mathematically, we think of it as a function that maps the points of the sample space to real numbers.
Example: Two tosses of a fair coin:
Let X be the random variable that denotes the number of heads occurring in the two tosses.
Possible values for X:0,1,2
[Figure: X maps the sample space S to the real numbers: TT → 0, TH and HT → 1, HH → 2]
1. Discrete random variables: the set of possible values is countable (as in the example above).
2. Continuous random variables: their values are spread continuously over intervals of real
numbers. Example: the atmospheric temperature at noon each day, ranging over, say, [0°C, 25°C].
We define the event X = x, for some value x, to be the set of elementary events in S that X maps
to x: {s ∈ S : X(s) = x}.
Probability density function: a function that assigns a probability to each of the possible values
of X: f_X(x) = Pr{X = x}.
Example: Let X be the random variable that denotes the number of heads occurring in the two
tosses.
x      0    1    2
f(x)   1/4  1/2  1/4
Properties:
1. 0 ≤ f(x) ≤ 1
2. ∑_x f(x) = 1
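A short Python sketch (assumed, not from the tutorial) that builds f(x) = Pr{X = x} for X = number of heads in two tosses and checks the two properties above:

```python
from itertools import product
from collections import Counter

S = ["".join(t) for t in product("HT", repeat=2)]    # equally likely outcomes
X = {s: s.count("H") for s in S}                     # X maps each outcome to #heads

counts = Counter(X.values())
f = {x: counts[x] / len(S) for x in sorted(counts)}  # f(x) = Pr{X = x}

print(f)                                      # {0: 0.25, 1: 0.5, 2: 0.25}
print(all(0 <= p <= 1 for p in f.values()))   # property 1
print(sum(f.values()) == 1)                   # property 2
```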
Two random variables X and Y are independent, if for all values x of X and all values y of Y
Pr{X=x and Y=y} = Pr{X=x}*Pr{Y=y}
4. Expected Value: the average value of a random variable, E[X] = ∑_x x*f(x)
Example: E[X] = ∑_{x=0}^{2} x*f(x) = 0*(1/4) + 1*(1/2) + 2*(1/4) = 1
This implies that there will be 1 head, on average, in 2 tosses of a fair coin.
Properties:
1. E[X+Y] = E[X] + E[Y]
2. E[a*X] = a*E[X], for any constant a
3. If X, Y are independent random variables, then E[XY]=E[X]E[Y]
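A quick numeric check of the expectation above and of properties 1 and 3, using indicator variables for the two tosses (a sketch, not part of the original notes):

```python
from itertools import product

S = ["".join(t) for t in product("HT", repeat=2)]  # four equally likely outcomes
p = 1 / len(S)

E = lambda g: sum(g(s) * p for s in S)             # expectation over the sample space

X1 = lambda s: 1 if s[0] == "H" else 0             # 1 if first toss is heads
X2 = lambda s: 1 if s[1] == "H" else 0             # 1 if second toss is heads
X  = lambda s: X1(s) + X2(s)                       # total number of heads

print(E(X))                                        # 1.0, matching the example above
print(E(X1) + E(X2))                               # 1.0: E[X1 + X2] = E[X1] + E[X2]
print(E(lambda s: X1(s) * X2(s)), E(X1) * E(X2))   # 0.25 0.25: E[X1*X2] = E[X1]*E[X2]
```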
Basic Recurrences (20 minutes)
A recurrence is an equation or inequality that describes a function in terms of its value on smaller
inputs.
Example 1: T(n) = T(n-1) + f(n) for some function f(n) that
has a closed-form sum, where T(0) = some constant.
Iteration method: Recurrences of the above form can be solved using the iteration method. This
means that we simply unwind the recursion: since T(n) = T(n-1) + f(n), it must be the case that
T(n-1) = T(n-2) + f(n-1). Then we can substitute for T(n-1) in the first equation:
T(n) = (T(n-2) + f(n-1)) + f(n). We can continue this until we just have T(0) on the right side.
So T(n) = T(0) + f(1) + f(2) + … + f(n). Since f(n) has a closed-form sum, the right hand side can
be simplified.
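A small Python sketch of the iteration method for one concrete instance, f(n) = n with T(0) = 1 (this particular f and base case are chosen just for illustration):

```python
def T_recursive(n):
    """T(n) = T(n-1) + n, with T(0) = 1."""
    return 1 if n == 0 else T_recursive(n - 1) + n

def T_unwound(n):
    """Unwound form: T(n) = T(0) + f(1) + ... + f(n) = 1 + n(n+1)/2."""
    return 1 + n * (n + 1) // 2

print(all(T_recursive(n) == T_unwound(n) for n in range(200)))  # True
```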
The master method provides a way of solving recurrences of the form T(n) = aT(n/b) + f(n), where
a ≥ 1, b > 1, and f(n) is an asymptotically positive function.
This is the running time of an algorithm that divides a problem of size n into a subproblems, each
of size n/b; each of the a subproblems is solved recursively in time T(n/b).
Example 2: State the master theorem without proof and give
an example like T(n) = 2T(n/2) + O(n) (20 minutes)
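As a sanity check of that example (assuming the concrete recurrence T(n) = 2T(n/2) + n with T(1) = 1 and n a power of two), the solution grows like n*log n, consistent with the Θ(n log n) bound the master theorem gives here:

```python
import math

def T(n):
    """T(n) = 2*T(n/2) + n, with T(1) = 1, for n a power of two."""
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in (4, 8, 12, 16):
    n = 2 ** k
    print(n, T(n), n * math.log2(n) + n)  # for these n, T(n) = n*log2(n) + n exactly
```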