Lecture 4

ANALYSIS OF ALGORITHMS
Classifying functions by their growth rates.
Consider two algorithms A and B that solve the same problem (e.g. sorting), and suppose we want
to compare them. Say W_A(n) = 2n and W_B(n) = 9n; which one is better? As we are counting
basic operations, to obtain the total number of operations we would multiply W_A(n) by a
constant c, and W_B(n) by a constant c'. Since we do not know these constants, we cannot tell
which one is more efficient.
Next suppose W_A(n) = n³/2 and W_B(n) = 5n². Which one is better? For small inputs A is
better, but for large inputs B is more efficient (the two curves cross at n = 10, where
n³/2 = 5n² = 500), as the sketch below illustrates.
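Here is a minimal sketch in Python (the language is my choice for illustration; the notes themselves contain no code) that tabulates both operation counts around the crossover:

# Compare W_A(n) = n^3/2 and W_B(n) = 5n^2 at small and large inputs.
for n in [2, 5, 10, 20, 100]:
    w_a = n**3 / 2   # algorithm A
    w_b = 5 * n**2   # algorithm B
    better = "A" if w_a < w_b else ("B" if w_b < w_a else "tie")
    print(f"n={n:4d}  W_A={w_a:10.1f}  W_B={w_b:8.0f}  better: {better}")

For n < 10 algorithm A takes fewer operations, at n = 10 they tie, and for n > 10 algorithm B wins.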
Thus, from the previous examples, we conclude that in order to classify functions we can
ignore constants and small inputs.
We can therefore classify algorithms using their "asymptotic growth rate", where small inputs
and constants play no role.
Let f(n) be a real function on the natural numbers. Consider the universe of all real
functions on the natural numbers (infinitely many); we will group these functions with
respect to f(n).
[Figure: the universe of functions partitioned relative to f(n). Θ(f): functions that grow at the same rate as f(n). Ω(f): functions that grow faster than or at the same rate as f(n). O(f): functions that grow slower than or at the same rate as f(n).]
As an example, suppose f(n) = n². One function that belongs to Ω(f(n)) is g(n) = n³ (as
n³ grows faster than n²), and one function that belongs to O(f(n)) is h(n) = n (as n grows
slower than n²).
Let
N = {0, 1, 2, 3, …} be the set of all non-negative integers,
R+ be the set of all positive reals, and
R* be R+ ∪ {0} (i.e. the set of all non-negative reals).
For algorithms, we are dealing with inputs whose sizes are non-negative integers, while the
number of basic operations is a non-negative real.
Let f: N → R* be a function from the non-negative integers to the non-negative reals, and
let g: N → R* be another function.
We say that g(n) ∈ O(f) (Big-O) if
lim_{n→∞} g(n)/f(n) = c, for some c ∈ R*.
That is, O(f) is the set of all functions g(n) that do not grow faster than f(n).
Example: Show that n² − n is in O(n²).
If that is the case, then
lim_{n→∞} (n² − n)/n² = lim_{n→∞} n²/n² − lim_{n→∞} n/n² = 1 − 0 = 1 ∈ R*.
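A quick numerical check (a minimal Python sketch, not part of the original notes) shows the ratio approaching the constant 1:

# Evaluate (n^2 - n) / n^2 for increasingly large n; the ratio tends to 1.
for n in [10, 100, 1000, 10**6]:
    print(f"n={n:8d}  ratio={(n**2 - n) / n**2:.6f}")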
The idea behind O(f) is the following: say that when calculating the worst case W(n) of
an algorithm, we find another function f(n) that takes at least as many basic
operations as W(n). Then we know for certain that for inputs of size n, as n goes to
infinity (large inputs), the algorithm takes at most c·f(n) operations; that is, c·f(n) is an
upper bound on the number of basic operations executed by the algorithm on inputs of size n.
Thus we say that W(n) ∈ O(f).
[Figure: for large n, the curve c·f(n) lies above the curve W(n).]
Similarly, let f: N → R* be a function from the non-negative integers to the non-negative
reals, and let g: N → R* be another function.
We say that g(n) ∈ Ω(f) if
lim_{n→∞} g(n)/f(n) = c, for some c ∈ R+ ∪ {∞}.
That is, Ω(f) is the set of all functions g(n) that grow at least as fast as f(n).
Example: Show that n³ + n is in Ω(n²).
If that is the case, then
lim_{n→∞} (n³ + n)/n² = lim_{n→∞} n³/n² + lim_{n→∞} n/n² = ∞ + 0 = ∞.
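Again, a quick numerical check (a minimal Python sketch, not in the original notes) shows the ratio growing without bound:

# Evaluate (n^3 + n) / n^2; the ratio grows without bound (the limit is infinity).
for n in [10, 100, 1000, 10**6]:
    print(f"n={n:8d}  ratio={(n**3 + n) / n**2:.4f}")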
We are not interested in finding Ω(f) on its own, as it represents a lower bound on the
number of basic operations; thus it does not tell us the worst case.
Finally, given a function f(n), we define Θ(f) = Ω(f) ∩ O(f), that is, the set of functions
that grow at the same rate as f(n). Thus we say that g(n) ∈ Θ(f) if
lim_{n→∞} g(n)/f(n) = c, for some c ∈ R+.
That is, the limit must go neither to 0 nor to ∞. The idea is that we can describe the
worst case W(n) by another function that is simpler.
Example: Show that n³/4 + n² − n is in Θ(n³). We must show that the limit goes to a
positive real constant.
lim_{n→∞} (n³/4 + n² − n)/n³ = lim_{n→∞} n³/(4n³) + lim_{n→∞} n²/n³ − lim_{n→∞} n/n³ = 1/4 + 0 − 0 = 1/4 ∈ R+.
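As before, a quick numerical check (a minimal Python sketch, not in the original notes) shows the ratio tending to the constant 1/4:

# Evaluate (n^3/4 + n^2 - n) / n^3; the ratio tends to 1/4.
for n in [10, 100, 1000, 10**6]:
    print(f"n={n:8d}  ratio={(n**3 / 4 + n**2 - n) / n**3:.6f}")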
The idea behind Θ is that if, for example, W(n) = n³/4 + n² − n, then we say that
W(n) is "of order" Θ(n³); since n³ is easier to remember than n³/4 + n² − n, we have
simplified the complexity.
Some tricks to know:
L'Hôpital's Rule:
Suppose that
lim_{n→∞} g(n) = lim_{n→∞} f(n) = ∞.
Then
lim_{n→∞} g(n)/f(n) = lim_{n→∞} g′(n)/f′(n),
assuming g′(n) and f′(n) exist.
Example:
Show that n log₂ n ∈ O(n²).
First, recall that for any base a, (log_a x)′ = (1/x) · log_a e.
Thus
lim_{n→∞} (n log₂ n)/n² = lim_{n→∞} (log₂ n)/n = lim_{n→∞} ((1/n) · log₂ e)/1 = 0 ∈ R*,
so n log₂ n ∈ O(n²).
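The same limit can also be checked symbolically; here is a minimal sketch using SymPy (assuming the sympy package is available; this is not part of the original notes):

import sympy

n = sympy.symbols('n', positive=True)
# Evaluate lim_{n -> oo} (n * log2(n)) / n^2 symbolically; the result is 0,
# in agreement with the L'Hopital computation above, so n*log2(n) is in O(n^2).
print(sympy.limit(n * sympy.log(n, 2) / n**2, n, sympy.oo))  # prints 0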