Lecture 4 - Recurrences

Recurrences
It continues…
Jeff Chastine
Recurrences
• When an algorithm calls itself recursively, its
running time is described by a recurrence.
• A recurrence describes a function in terms of its value on smaller inputs
• There are four methods for solving these: substitution, iteration, the recursion tree, and the master method
What it looks like
T(n) = Θ(1)               if n = 1
T(n) = 2T(n/2) + Θ(n)     if n > 1
• This is the recurrence of MERGE-SORT
• What this says is that the time is constant (Θ(1)) if n = 1
• Else, the time is twice the cost of sorting half the array, plus Θ(n) time to merge the sorted sub-arrays
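For concreteness, a minimal Python sketch of merge sort (an illustration, not code from the lecture), with comments marking where the Θ(1), 2T(n/2), and Θ(n) terms come from:

```python
def merge_sort(a):
    # Base case: a list of length 0 or 1 is already sorted -- the Θ(1) term.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    # Two recursive calls on halves of the input -- the 2T(n/2) term.
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merging two sorted halves takes linear time -- the Θ(n) term.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```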
Substitution
• Similar to induction
• Guess a solution, then prove the guess holds for the recursive calls
• Powerful method, but can sometimes be
difficult to guess the solution!
Substitution
• Example:
– T(n) = 2T(n/2) + n
– Guess that T(n) = O(n lg n)
• We must prove that T(n) ≤ cn lg n for some constant c > 0
• Assume it holds for n/2 as well:
T(n/2) ≤ c(n/2) lg(n/2)
// Rewrite using n/2
T(n) ≤ 2(c(n/2) lg(n/2)) + n
// Substitution
= cn lg(n/2) + n
// Math stuff
= cn(lg n – lg 2) + n
= cn lg n – cn + n
≤ cn lg n    ∀ c ≥ 1
Note: ∀ means ‘for all’
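As an informal sanity check (not from the original slides), a short Python script can tabulate T(n) = 2T(⌊n/2⌋) + n, assuming a base case T(1) = 1, and compare it against c·n·lg n with c = 2:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence T(n) = 2T(n/2) + n with assumed base case T(1) = 1;
    # floor division stands in for n/2 on odd inputs.
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

c = 2  # candidate constant for the guess T(n) <= c * n * lg n
for n in (2, 16, 128, 1024, 1 << 20):
    bound = c * n * math.log2(n)
    print(n, T(n), round(bound), T(n) <= bound)  # last column stays True
```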
Subtleties
• Let T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + 1
• Assume that T(n) = O(n), i.e. T(n) ≤ cn
• Then T(⌈n/2⌉) ≤ c⌈n/2⌉ and T(⌊n/2⌋) ≤ c⌊n/2⌋
T(n) ≤ c⌈n/2⌉ + c⌊n/2⌋ + 1
= cn + 1
which does not imply T(n) ≤ cn
• Here, the guess is asymptotically correct, but the inductive step misses by a constant!
Subtleties
• We strengthen our guess: T(n) ≤ cn – b
T(n) ≤ (c⌈n/2⌉ – b) + (c⌊n/2⌋ – b) + 1
= cn – 2b + 1
≤ cn – b    ∀ b ≥ 1
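A similar quick check (again assuming the base case T(1) = 1, which the slides leave unspecified) that the strengthened bound T(n) ≤ cn – b holds with, say, c = 2 and b = 1:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(ceil(n/2)) + T(floor(n/2)) + 1, with assumed base case T(1) = 1.
    if n <= 1:
        return 1
    return T(math.ceil(n / 2)) + T(n // 2) + 1

c, b = 2, 1
print(all(T(n) <= c * n - b for n in range(1, 2000)))  # prints True
```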
One Last Example
Iteration Method
• Expand out the recurrence:
• T(n) = 3T(n/4) + n
• T(n) = 3(3T(n/16) + n/4) + n
• T(n) = 3(3(3T(n/64) + n/16) + n/4) + n
• T(n) < n + 3n/4 + 9n/16 + 27n/64 + …
< n ∑_{i=0}^{∞} (3/4)^i
= n · 1/(1 – 3/4)
= 4n = O(n)
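To see the 4n bound numerically, here is a small Python expansion of T(n) = 3T(n/4) + n; the base case T(n) = 1 for n < 4 is an assumption, and floor division stands in for n/4:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3T(n/4) + n, with an assumed base case T(n) = 1 for n < 4.
    if n < 4:
        return 1
    return 3 * T(n // 4) + n

for n in (4, 64, 1024, 1 << 20):
    print(n, T(n), 4 * n, T(n) <= 4 * n)  # the 4n bound holds at every n tried
```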
Recursion Tree Method
• A recursion tree is built
• We sum up the cost at each level
• Total cost = the sum of the per-level costs over all levels
• Usually used to generate a good guess for the substitution method
• Could still be used as a direct proof
• Example: T(n) = 3T(n/4) + Θ(n²)
[Recursion tree for T(n) = 3T(n/4) + cn², built up level by level over several slides]
• Root: cost cn²
• Depth 1: three subproblems of size n/4, each costing c(n/4)²; level total (3/16)cn²
• Depth 2: nine subproblems of size n/16, each costing c(n/16)²; level total (3/16)²cn²
• …
• Bottom level: T(1) leaves, Θ(n^(log_4 3)) of them in total
Questions
• How many levels does this tree have?
• The subproblem size at depth i is n/4^i
• When does the subproblem hit size 1?
– n/4^i = 1
– n = 4^i
– i = log_4 n
• Therefore, the tree has log_4 n + 1 levels (depths 0 through log_4 n)
• The cost at depth i is 3^i · c(n/4^i)² = (3/16)^i cn²
• The last level has 3^(log_4 n) = n^(log_4 3) nodes
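A short Python check, assuming c = 1 and a base case T(n) = 1 for n < 4, that the per-level costs above really do add up to the value of the recurrence, and that the total stays within a constant factor of n²:

```python
import math
from functools import lru_cache

c = 1  # constant of the cn^2 term (an assumption for the check)

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3T(n/4) + c*n^2, with an assumed base case T(n) = 1 for n < 4.
    if n < 4:
        return 1
    return 3 * T(n // 4) + c * n * n

n = 4 ** 8            # a power of 4, so every division is exact
levels = round(math.log(n, 4))
internal = sum((3 / 16) ** i * c * n * n for i in range(levels))  # depths 0 .. levels-1
leaves = 3 ** levels  # n^(log_4 3) leaves, each costing T(1) = 1
print(T(n), int(internal + leaves))   # the two totals agree
print(T(n) / (n * n))                 # ratio stays below 16/13, i.e. Θ(n²)
```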
The Master Method
When the recurrence has the form (with constants a ≥ 1 and b > 1):
T(n) = aT(n/b) + f(n)
• If f(n) = O(n^(log_b a – ε)) for some constant ε > 0,
then T(n) = Θ(n^(log_b a))
• If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n)
• If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a·f(n/b) ≤ c·f(n)
for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n))
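Purely as an illustration, and only for polynomial driving functions f(n) = n^k (for which the extra regularity condition of Case 3 holds automatically), a tiny Python helper can pick the case by comparing k with log_b a:

```python
import math

def master_case(a, b, k):
    """Classify T(n) = aT(n/b) + Θ(n^k) by comparing k with the critical
    exponent log_b a. Only meant for polynomial f(n) = n^k."""
    crit = math.log(a) / math.log(b)   # the critical exponent log_b a
    if abs(k - crit) < 1e-9:
        return f"Case 2: T(n) = Θ(n^{crit:g} lg n)"
    if k < crit:
        return f"Case 1: T(n) = Θ(n^{crit:g})"
    return f"Case 3: T(n) = Θ(n^{k:g})"

print(master_case(9, 3, 1))    # T(n) = 9T(n/3) + n  -> Case 1: Θ(n^2)
print(master_case(1, 1.5, 0))  # T(n) = T(2n/3) + 1  -> Case 2: Θ(n^0 lg n) = Θ(lg n)
print(master_case(2, 2, 1))    # T(n) = 2T(n/2) + n  -> Case 2: Θ(n^1 lg n)
```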
• T(n) = 9T(n/3) + n
– a = 9, b = 3, thus n^(log_b a) = n^(log_3 9) = n²
– f(n) = n = O(n^(2–ε)), so Case 1 applies: T(n) = Θ(n²)
• T(n) = T(2n/3) + 1
– a = 1, b = 3/2, thus n^(log_b a) = n^(log_(3/2) 1) = n⁰ = 1
– f(n) = 1 = Θ(1), so Case 2 applies: T(n) = Θ(lg n)
• Problem:
– T(n) = 2T(n/2) + n lg n
– Here f(n) = n lg n is larger than n^(log_2 2) = n, but not polynomially larger, so none of the three cases applies