Recurrence Equations
Algorithm: Design & Analysis
[4]
In the last class…
  Recursive Procedures
  Analyzing the Recursive Computation
  Induction over Recursive Procedures
  Proving Correctness of Procedures

Recurrence Equations
  Recursive algorithm and recurrence equation
  Solution of the recurrence equations
    Guess and prove
    Recursion tree
    Master theorem
  Divide-and-conquer
Recurrence Equation: Concept

A recurrence equation defines a function over the natural number n
in terms of its own value at one or more integers smaller than n.

Example: Fibonacci numbers
    F_n = F_{n-1} + F_{n-2}    for n ≥ 2
    F_0 = 0,  F_1 = 1

Recurrence equations are used to express the cost of recursive procedures.
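As a concrete illustration (a sketch, not part of the slides), the recursive procedure below computes the Fibonacci numbers, and the number of calls it makes satisfies a recurrence of the same shape, C(n) = C(n-1) + C(n-2) + 1; that recurrence is exactly the cost of the procedure.

def fib(n: int) -> int:
    """Recursive Fibonacci: F_0 = 0, F_1 = 1, F_n = F_{n-1} + F_{n-2}."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

def calls(n: int) -> int:
    """Number of calls made by fib(n): C(n) = C(n-1) + C(n-2) + 1, C(0) = C(1) = 1."""
    if n < 2:
        return 1
    return calls(n - 1) + calls(n - 2) + 1

print([fib(n) for n in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
print(calls(10))                     # 177 calls: the cost itself grows like a Fibonacci number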
Linear Homogeneous Relation

A recurrence relation of the form
    a_n = r_1 a_{n-1} + r_2 a_{n-2} + ... + r_k a_{n-k}
is called a linear homogeneous relation of degree k.

Examples:
    c_n = (-2) c_{n-1}             Yes
    f_n = f_{n-1} + f_{n-2}        Yes
    a_n = a_{n-1} + 3              No
    g_n = g_{n-1}^2 + g_{n-2}      No
Characteristic Equation

For a linear homogeneous recurrence relation of degree k
    a_n = r_1 a_{n-1} + r_2 a_{n-2} + ... + r_k a_{n-k}
the degree-k polynomial equation
    x^k - r_1 x^{k-1} - r_2 x^{k-2} - ... - r_k = 0
is called its characteristic equation.

The characteristic equation of a linear homogeneous recurrence relation of degree 2 is
    x^2 - r_1 x - r_2 = 0
Solution of Recurrence Relation

If the characteristic equation x^2 - r_1 x - r_2 = 0 of the recurrence relation
a_n = r_1 a_{n-1} + r_2 a_{n-2} has two distinct roots s_1 and s_2, then
    a_n = u s_1^n + v s_2^n
where u and v depend on the initial conditions, is the explicit formula for the sequence.

If the equation has a single (repeated) root s, the explicit formula becomes
    a_n = u s^n + v n s^n
with u and v again determined by the initial conditions.
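As a quick check (my own sketch; the coefficients and initial values are an arbitrary example), the following solves for u and v from the initial conditions and verifies the explicit formula against direct recursion.

import math

# Example degree-2 linear homogeneous recurrence (illustration only):
#   a_n = 5*a_{n-1} - 6*a_{n-2},  a_0 = 2, a_1 = 5
r1, r2 = 5.0, -6.0
a0, a1 = 2.0, 5.0

# Distinct roots of the characteristic equation x^2 - r1*x - r2 = 0
disc = math.sqrt(r1 * r1 + 4 * r2)
s1, s2 = (r1 + disc) / 2, (r1 - disc) / 2

# Solve u + v = a0 and u*s1 + v*s2 = a1 for u and v
v = (a1 - a0 * s1) / (s2 - s1)
u = a0 - v

# Check the explicit formula a_n = u*s1^n + v*s2^n against the recurrence
prev, cur = a0, a1
for n in range(2, 10):
    prev, cur = cur, r1 * cur + r2 * prev
    closed = u * s1 ** n + v * s2 ** n
    assert abs(closed - cur) < 1e-6, (n, closed, cur)
print("u =", u, "v =", v)   # here the closed form is a_n = 3^n + 2^n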
Fibonacci Sequence

    f_1 = 1, f_2 = 1
    f_n = f_{n-1} + f_{n-2}

    1, 1, 2, 3, 5, 8, 13, 21, 34, ...
Explicit Formula for the Fibonacci Sequence

The characteristic equation is x^2 - x - 1 = 0, which has roots:
    s_1 = \frac{1+\sqrt{5}}{2}   and   s_2 = \frac{1-\sqrt{5}}{2}

Note: by the initial conditions, f_1 = u s_1 + v s_2 = 1 and f_2 = u s_1^2 + v s_2^2 = 1,
which results in
    f_n = \frac{1}{\sqrt{5}}\left(\frac{1+\sqrt{5}}{2}\right)^n - \frac{1}{\sqrt{5}}\left(\frac{1-\sqrt{5}}{2}\right)^n
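A numerical sanity check of this closed form (a sketch, not from the slides): compare the formula with the values produced by the recurrence.

import math

def fib_closed(n: int) -> float:
    """Closed form built from the characteristic roots (1 ± sqrt(5)) / 2."""
    s1 = (1 + math.sqrt(5)) / 2
    s2 = (1 - math.sqrt(5)) / 2
    return (s1 ** n - s2 ** n) / math.sqrt(5)

# Compare against f_n = f_{n-1} + f_{n-2}, f_1 = f_2 = 1
a, b = 1, 1
for n in range(1, 20):
    assert round(fib_closed(n)) == a, (n, fib_closed(n), a)
    a, b = b, a + b
print("closed form matches the recurrence for n = 1..19")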
Determining the Upper Bound

Example: T(n) = 2T(n/2) + n

Guess T(n) ∈ O(n)?
  T(n) ≤ cn, to be proved for c large enough. Try and fail:
      T(n) = 2T(n/2) + n
           ≤ 2c(n/2) + n
           = (c+1)n
  which never gives T(n) ≤ cn, no matter how large c is.

Guess T(n) ∈ O(n^2)?
  T(n) ≤ cn^2, to be proved for c large enough (provable, but not tight).

Or maybe T(n) ∈ O(n log n)?
  T(n) ≤ cn lg n, to be proved for c large enough:
      T(n) = 2T(n/2) + n
           ≤ 2(c(n/2) lg(n/2)) + n
           = cn lg(n/2) + n
           = cn lg n - cn lg 2 + n
           = cn lg n - cn + n
           ≤ cn lg n          for c ≥ 1
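The sketch below (not part of the slides) evaluates T(n) = 2T(⌊n/2⌋) + n with T(1) = 1 and checks the guessed bound T(n) ≤ c·n·lg n numerically; the base-case value and the constant c = 2 are illustrative assumptions.

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """Exact value of T(n) = 2*T(n // 2) + n with T(1) = 1 (assumed base case)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

c = 2  # illustrative constant for the guess T(n) <= c * n * lg n
for n in range(2, 10_000):
    assert T(n) <= c * n * math.log2(n), n
print("T(n) <= 2 n lg n holds for n = 2..9999")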
Recursion Tree

[Figure: the recursion tree for T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + n. The root T(n) has
nonrecursive cost n and two children T(n/2), each with nonrecursive cost n/2; each of
these has two children T(n/4) with nonrecursive cost n/4, and so on. Each node records
its size and its nonrecursive cost.]
Recursion Tree Rules

Construction of a recursion tree:
    work copy: use an auxiliary variable
    root node: the original problem
    expansion of a node:
        the recursive parts become children
        the nonrecursive parts give the node's nonrecursive cost
    leaf: a node with base-case size (not expanded further)
Recursion Tree Equation

For any subtree of the recursion tree,
    size field of the root =
        Σ (nonrecursive costs of expanded nodes) + Σ (size fields of incomplete nodes)

Example, divide-and-conquer: T(n) = bT(n/c) + f(n). After the kth expansion:
    T(n) = b^k\, T\!\left(\frac{n}{c^k}\right) + \sum_{i=0}^{k-1} b^i f\!\left(\frac{n}{c^i}\right)
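A small sketch (assumptions: b = 2, c = 2, f(n) = n, T(1) = 1, exact division) checking that every partial expansion reproduces the fully expanded cost.

# Check T(n) = b^k * T(n / c^k) + sum_{i=0}^{k-1} b^i * f(n / c^i)
# against direct evaluation of T(n) = b*T(n/c) + f(n).
b, c = 2, 2

def f(n):
    return n          # nonrecursive cost (assumed)

def T(n):
    return 1 if n <= 1 else b * T(n / c) + f(n)

n = 1024.0            # a power of c, so division stays exact
for k in range(1, 11):
    expanded = b ** k * T(n / c ** k) + sum(b ** i * f(n / c ** i) for i in range(k))
    assert abs(expanded - T(n)) < 1e-6, k
print("every partial expansion equals T(n) =", T(n))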
Evaluation of a Recursion Tree

Compute the sum of the nonrecursive costs of all nodes, proceeding level by level
down the tree. This requires knowing the maximum depth of the recursion tree, that is,
the depth at which the size parameter reduces to a base case.
Recursion Tree

Work copy: T(k) = T(⌈k/2⌉) + T(⌊k/2⌋) + k

[Figure: level 0 is the root with nonrecursive cost n; level 1 has two nodes of cost n/2;
level 2 has four nodes of cost n/4; at depth d the size is n/2^d, and the base case
(size 1) is reached at depth lg n.]

Expanding two levels: T(n) = n + 2(n/2) + 4T(n/4) = 2n + 4T(n/4).
Each level contributes n, over lg n levels, so T(n) = n lg n.
T(n)=3T(n/4)+(n
Recursion Tree for
cn2
cn2
c(n/4)2
c(n/4)2
3
cn 2
16
c(n/4)2
log4n
2
 3
2
  cn
 16 
c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2 c(n/16)2
……
……
…
T(1) T(1) T(1) T(1) T(1) T(1) T(1) T(1) T(1) T(1)
Note:
3log4 n  n log4 3
T(1) T(1) T(1)

 nlog4 3
Total: O(n2)
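A numerical sketch (with the illustrative choices c = 1, leaf cost 1, exact division) summing the tree level by level; the row-sums shrink by the factor 3/16, so the total is dominated by the root's cn^2.

import math

# Row-sums of the recursion tree for T(n) = 3*T(n/4) + c*n^2.
c = 1.0
n = 4 ** 8                         # a power of 4
depth = round(math.log(n, 4))

total = 0.0
for d in range(depth):             # internal levels
    row_sum = 3 ** d * c * (n / 4 ** d) ** 2   # equals (3/16)^d * c * n^2
    total += row_sum
total += 3 ** depth * 1.0          # the n^{log_4 3} leaves, cost 1 each

print("total / n^2 =", total / n ** 2)   # close to the geometric-series bound 16/13
print("16/13       =", 16 / 13)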
Verifying the "Guess" by the Recursion Tree

    T(n) = \sum_{i=0}^{\log_4 n - 1} \left(\frac{3}{16}\right)^i cn^2 + \Theta(n^{\log_4 3})
         < \sum_{i=0}^{\infty} \left(\frac{3}{16}\right)^i cn^2 + \Theta(n^{\log_4 3})
         = \frac{1}{1 - 3/16}\, cn^2 + \Theta(n^{\log_4 3})
         = \frac{16}{13}\, cn^2 + \Theta(n^{\log_4 3}) = O(n^2)

Verifying by substitution, with the inductive hypothesis T(k) ≤ dk^2 for k < n:
    T(n) = 3T(\lfloor n/4 \rfloor) + cn^2
         ≤ 3d\lfloor n/4 \rfloor^2 + cn^2
         ≤ 3d(n/4)^2 + cn^2
         = \frac{3}{16}\, dn^2 + cn^2
         ≤ dn^2          when d ≥ \frac{16}{13} c
Common Recurrence Equations

    Divide and Conquer:        T(n) = bT(n/c) + f(n)
    Chip and Conquer:          T(n) = T(n - c) + f(n)
    Chip and Be Conquered:     T(n) = bT(n - c) + f(n)
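A small sketch (illustrative assumptions: f(n) = n, base cost 1, b = 2 and c = 2 for the divide form, c = 1 for the two chip forms) just to make the very different growth of the three shapes concrete.

from functools import lru_cache

@lru_cache(maxsize=None)
def divide_and_conquer(n):      # T(n) = 2*T(n // 2) + n   -> Theta(n log n)
    return 1 if n <= 1 else 2 * divide_and_conquer(n // 2) + n

@lru_cache(maxsize=None)
def chip_and_conquer(n):        # T(n) = T(n - 1) + n      -> Theta(n^2)
    return 1 if n <= 1 else chip_and_conquer(n - 1) + n

@lru_cache(maxsize=None)
def chip_and_be_conquered(n):   # T(n) = 2*T(n - 1) + n    -> Theta(2^n)
    return 1 if n <= 1 else 2 * chip_and_be_conquered(n - 1) + n

for n in (8, 16, 24):
    print(n, divide_and_conquer(n), chip_and_conquer(n), chip_and_be_conquered(n))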
Recursion Tree for T(n) = bT(n/c) + f(n)

[Figure: level 0 contributes f(n); level 1 has b nodes, contributing b·f(n/c);
level 2 contributes b^2·f(n/c^2); and so on for log_c n levels, down to the T(1) leaves.]

Note: the number of leaves is b^{\log_c n} = n^{\log_c b}, contributing \Theta(n^{\log_c b}).

Total?
Solving the Divide-and-Conquer Equation

The recurrence equation for divide-and-conquer, in the general case: T(n) = bT(n/c) + f(n)

Observations:
    Let the base cases occur at depth D (the leaves); then n/c^D = 1, that is, D = lg(n)/lg(c).
    Let the number of leaves of the tree be L; then L = b^D, that is, L = b^{lg(n)/lg(c)}.
    By a little algebra, L = n^E, where E = lg(b)/lg(c) is called the critical exponent:
        L = b^{\lg n / \lg c} = \left(2^{\lg b}\right)^{\lg n / \lg c} = \left(2^{\lg n}\right)^{\lg b / \lg c} = n^{\lg b / \lg c} = n^E
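A sketch (assuming exact division and unit-size base cases) that counts the leaves of the recursion tree directly and compares the count with n^E.

import math

def count_leaves(n, b, c):
    """Number of base-case leaves in the recursion tree of T(n) = b*T(n/c) + f(n)."""
    if n <= 1:
        return 1
    return b * count_leaves(n / c, b, c)

b, c = 9, 3
E = math.log2(b) / math.log2(c)            # critical exponent, here E = 2
n = 3 ** 7                                 # a power of c, so division is exact
print("leaves =", count_leaves(n, b, c))   # 9^7
print("n^E    =", n ** E)                  # 3^14 = 9^7, the same value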
Divide-and-Conquer: the Solution

    The recursion tree has depth D = lg(n)/lg(c), so there are about that many row-sums.
    The 0th row-sum is f(n), the nonrecursive cost of the root.
    The Dth row-sum is n^E, assuming base cases cost 1, or Θ(n^E) in any event.
    The solution of the divide-and-conquer equation is the sum of the nonrecursive
    costs of all nodes in the tree, which is the sum of the row-sums.
Little Master Theorem

Complexity of the divide-and-conquer equation:
    Case 1: the row-sums form an increasing geometric series:
        T(n) ∈ Θ(n^E), where E is the critical exponent
    Case 2: the row-sums remain about constant:
        T(n) ∈ Θ(f(n) log n)
    Case 3: the row-sums form a decreasing geometric series:
        T(n) ∈ Θ(f(n))
Master Theorem

Loosening the restrictions on f(n). (The positive ε is critical, resulting in gaps between
the cases as well.)

    Case 1: f(n) ∈ O(n^{E-ε}) for some ε > 0; then
        T(n) ∈ Θ(n^E)
    Case 2: f(n) ∈ Θ(n^E), so all node depths contribute about equally; then
        T(n) ∈ Θ(f(n) log n)
    Case 3: f(n) ∈ Ω(n^{E+ε}) for some ε > 0, and f(n) ∈ O(n^{E+δ}) for some δ ≥ ε; then
        T(n) ∈ Θ(f(n))
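A minimal sketch (my own helper, not from the slides) of the classification for the common special case f(n) = Θ(n^p); it reports which case applies and the resulting bound. Polylog factors in f(n) and case 3's upper restriction on f(n) are ignored here.

import math

def master_case(b: float, c: float, p: float) -> str:
    """Classify T(n) = b*T(n/c) + Theta(n^p) by comparing p with E = log_c b.

    Simplified sketch: assumes f(n) = Theta(n^p) exactly (no log factors).
    """
    E = math.log(b) / math.log(c)     # critical exponent
    if math.isclose(p, E):
        return f"case 2: T(n) = Theta(n^{p:g} * log n)"
    if p < E:
        return f"case 1: T(n) = Theta(n^{E:.3f})"
    return f"case 3: T(n) = Theta(n^{p:g})"

print(master_case(9, 3, 1))     # T(n) = 9T(n/3) + n     -> Theta(n^2)
print(master_case(1, 1.5, 0))   # T(n) = T(2n/3) + 1     -> Theta(log n)
print(master_case(8, 2, 4))     # T(n) = 8T(n/2) + n^4   -> Theta(n^4)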
Using the Master Theorem

Example 1: T(n) = 9T(n/3) + n
    b = 9, c = 3, E = 2, f(n) = n ∈ O(n^{E-1});
    case 1 applies, T(n) ∈ Θ(n^2)

Example 2: T(n) = T(2n/3) + 1
    b = 1, c = 3/2, E = 0, f(n) = 1 ∈ Θ(n^0);
    case 2 applies, T(n) ∈ Θ(lg n)

Example 3: T(n) = 3T(n/4) + n lg n
    b = 3, c = 4, E = log_4 3 ≈ 0.793, f(n) = n lg n ∈ Ω(n^{E+0.2}) and f(n) ∈ O(n^{E+0.21});
    case 3 applies, T(n) ∈ Θ(n lg n)
Looking at the Gap

T(n) = 2T(n/2) + n lg n
    b = 2, c = 2, E = 1, f(n) = n lg n
    We have f(n) = ω(n^E), but no ε > 0 satisfies f(n) = Ω(n^{E+ε}), since lg n grows
    more slowly than n^ε for any small positive ε:
        \lim_{n \to \infty} \frac{\ln n}{n^{\varepsilon}} = 0    for any ε > 0
    So case 3 does not apply; but case 2 does not apply either.
Proof of the Master Theorem

Case 3 as an example. Summing the recursion tree level by level,
    T(n) = \sum_{d=0}^{\lg(n)/\lg(c)} b^d f\!\left(\frac{n}{c^d}\right)

(Note: in asymptotic analysis, f(n) ∈ Θ(n^{E+ε}) lets us treat f(n) as about n^{E+ε},
ignoring the coefficients.)

Since b = c^E, each term satisfies
    b^d f\!\left(\frac{n}{c^d}\right) \approx b^d \left(\frac{n}{c^d}\right)^{E+\varepsilon}
        = b^d\, \frac{n^{E+\varepsilon}}{c^{Ed}\, c^{\varepsilon d}}
        = \frac{f(n)}{c^{\varepsilon d}}
so
    T(n) \approx \sum_{d=0}^{\lg(n)/\lg(c)} \frac{f(n)}{c^{\varepsilon d}}
a decreasing geometric series, which is Θ(f(n)).
Home Assignment

pp. 143: 3.7, 3.8, 3.9, 3.10