Week 1

Analyzing algorithms using big-O, omega, theta
• First, we analyze some easy algorithms.
• Most of them have the same running time for
all inputs of length n.
Example 1
char A[100]
for i = 1..100
  print A[i]
• Running time: 100*c = Theta(1) (why?)
• O(1), Omega(1) => Theta(1)
Example 2
char A[n]
for i = 1..n
  print A[i]
• Running time: n*c
• O(n), Omega(n) => Theta(n)
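• For a concrete feel, here is a minimal runnable C sketch of Examples 1 and 2 (the placeholder array contents and the test value 20 are illustrative, not from the slides): the first loop's bound is the constant 100, so its cost never depends on the input size, while the second loop's bound is n, so its cost grows linearly with n.

#include <stdio.h>
#include <stdlib.h>

/* Example 1: the loop bound (100) is a constant, independent of any input,
   so the loop always does 100*c work: Theta(1). */
void example1(void) {
    char A[100];
    for (int i = 0; i < 100; i++) A[i] = 'x';   /* placeholder data */
    for (int i = 0; i < 100; i++) putchar(A[i]);
    putchar('\n');
}

/* Example 2: the loop bound is n, so the loop does n*c work: Theta(n). */
void example2(int n) {
    char *A = malloc(n);
    if (A == NULL) return;
    for (int i = 0; i < n; i++) A[i] = 'y';     /* placeholder data */
    for (int i = 0; i < n; i++) putchar(A[i]);
    putchar('\n');
    free(A);
}

int main(void) {
    example1();      /* same amount of work no matter what */
    example2(20);    /* try different n: the work scales with n */
    return 0;
}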
Example 3
int A[n]
for i = 1..n
  binarySearch(A, i)
• Remember: binarySearch is O(lg n).
• n calls, each costing O(lg n), so the total is O(n lg n).
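• The slides use binarySearch without showing its code; below is a minimal C sketch of a standard iterative binary search (it assumes A is sorted, which binary search requires; the array 1..n in main is illustrative), followed by the Example 3 loop that calls it n times. Each call halves the search range, so it costs O(lg n), and the n calls together cost O(n lg n).

#include <stdbool.h>
#include <stdio.h>

/* Standard iterative binary search over a sorted array: O(lg n) per call. */
bool binarySearch(const int A[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* midpoint, written to avoid overflow */
        if (A[mid] == key)
            return true;
        else if (A[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return false;
}

int main(void) {
    enum { n = 16 };
    int A[n];
    for (int i = 0; i < n; i++) A[i] = i + 1;   /* sorted: 1..n */

    /* Example 3: n calls, each O(lg n) => O(n lg n) total. */
    for (int i = 1; i <= n; i++)
        printf("%d %s\n", i, binarySearch(A, n, i) ? "found" : "missing");
    return 0;
}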
Example 4
char A[n]
for i = 1..n
  for j = 1..n
    print A[i]
• O(n^2), Omega(n^2) => Theta(n^2)
Example 5
char A[n]
for i = 1..n
  for j = 1..i
    print A[j]
>= (does at least as much work as)
char A[n]
for i = n/2..n
  for j = 1..n/2
    print A[j]
• Big-O: i is always <= n, so j always iterates up to at most n, so this is at most n*n units of work, which is O(n^2).
• Big-Omega:
– Useful trick: consider the iterations of the first loop for which i >= n/2.
– In these iterations, the second loop iterates from j=1 to at least j=n/2.
– Therefore, the time to perform just these iterations is at least n/2*n/2 = (1/4) n^2, which is Omega(n^2).
– The time to perform ALL iterations is even more than this so, of course, the whole algorithm must take Omega(n^2) time.
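• Cross-check (an exact count, not needed for the bounds above): on the i-th outer iteration the inner loop prints i characters, so the total number of prints is 1 + 2 + … + n = n(n+1)/2, which indeed lies between (1/4) n^2 and n^2 for n >= 1.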
Example 6
char A[n]
for i = 1..n
  for j = 1..i
    for k = 1..j
      for l = 1..7
        print A[k]
>= (does at least as much work as)
char A[n]
for i = n/2..n
  for j = n/4..n/2
    for k = 1..n/4
      for l = 1..7
        print A[k]
• Big-O: i <= n, j <= i <= n, and k <= j <= n, so running time <= n*n*n*7 = c*n^3 = O(n^3).
• Big-Omega:
– Consider only the iterations of the first loop from n/2 up to n.
– For these values of i, the second loop always iterates up to at least n/2!
– Consider only the iterations of the second loop from n/4 up to n/2.
– For these values of j, the third loop always iterates up to at least n/4!
– Consider only the iterations of the third loop up to n/4.
– The fourth loop always iterates up to 7.
– Total iterations: n/2 * n/4 * n/4 * 7 = (7/32) n^3 = Omega(n^3)
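• As an optional sanity check (not part of the slide's argument), this small C sketch counts Example 6's prints with the actual loops and compares the count against the closed form 7*n*(n+1)*(n+2)/6 and against the slide's bounds (7/32) n^3 and 7 n^3.

#include <stdio.h>

int main(void) {
    for (long n = 1; n <= 64; n *= 2) {
        long count = 0;
        /* Example 6's four loops, counting instead of printing. */
        for (long i = 1; i <= n; i++)
            for (long j = 1; j <= i; j++)
                for (long k = 1; k <= j; k++)
                    for (long l = 1; l <= 7; l++)
                        count++;

        long exact = 7 * n * (n + 1) * (n + 2) / 6;   /* closed form for the same count */
        double lower = 7.0 / 32.0 * n * n * n;        /* the slide's Omega(n^3) bound */
        double upper = 7.0 * n * n * n;               /* the slide's O(n^3) bound */
        printf("n=%3ld  count=%9ld  exact=%9ld  lower=%12.0f  upper=%12.0f\n",
               n, count, exact, lower, upper);
    }
    return 0;
}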
Analyzing algorithms using big-O, omega, theta
• Now, we are going to perform some slightly
different analysis.
• The next two algorithms’ running times vary
widely for different inputs of length n.
• For some length n inputs, they terminate
very quickly.
• For other length n inputs, they can be slow.
Example 1
LinearSearch(A[1..n], key)
for i = 1..n
if A[i] = key then return true
return false
• Might take 1 iteration… might take n...
• Big-O: at most n iterations, constant work per
iteration, so O(n)
• Big-Omega: can you find a bad input that makes
the algorithm take Omega(n) time?
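• One concrete answer to the big-Omega question, sketched in C (the helper below and its test values are illustrative, not from the slides): if the key is not in the array at all, the loop can never return early, so it performs all n iterations, which is Omega(n). A key that only appears in the last position forces the same behaviour.

#include <stdbool.h>
#include <stdio.h>

/* Example 1 in C: returns as soon as the key is found, so the running
   time depends on where (and whether) the key occurs. */
bool linearSearch(const int A[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (A[i] == key) return true;
    return false;
}

int main(void) {
    enum { n = 1000 };
    int A[n];
    for (int i = 0; i < n; i++) A[i] = i;

    printf("%d\n", linearSearch(A, n, 0));    /* best case: found immediately, 1 iteration */
    printf("%d\n", linearSearch(A, n, -1));   /* key absent: all n iterations => Omega(n) */
    return 0;
}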
Example 2
• Your boss tells you that your group needs to
develop a sorting algorithm for the company's
web server that runs in O(n lg n) time.
• If you can't prove that it can sort any input of
length n in O(n lg n) time, it's no good to him. He
doesn't want to lose orders because of a slow
server.
• Your colleague tells you that he's been working
on this piece of code, which he believes should fit
the bill. He says he thinks it can sort any input of
length n in O(n lg n) time, but he doesn't know
how to prove this.
Example 2 continued
WeirdSort(A[1..n])
  last := n
  sorted := false
  while not sorted do
    sorted := true
    for j := 1 to last-1 do
      if A[j] > A[j+1] then
        swap A[j] and A[j+1]
        sorted := false
      end if
    end for
    last := last-1
  end while
[Discuss… result is:] O(n^2), Omega(n^2) => Theta(n^2) in the worst case, so it cannot sort every length-n input in O(n lg n) time! (A worst-case input is sketched below.)
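• A minimal C translation of WeirdSort (it is essentially bubble sort with an early-exit flag; the reverse-sorted test input below is illustrative). On an already-sorted array the flag stays true and it stops after one pass, but on a reverse-sorted array it needs about n passes of up to n comparisons each, roughly n^2/2 comparisons in total. That single family of inputs already forces Omega(n^2) time, which is why the worst case is Theta(n^2) rather than O(n lg n).

#include <stdbool.h>
#include <stdio.h>

/* Direct C translation of WeirdSort: bubble sort with an early-exit flag. */
void weirdSort(int A[], int n) {
    int last = n;
    bool sorted = false;
    while (!sorted) {
        sorted = true;
        for (int j = 0; j < last - 1; j++) {
            if (A[j] > A[j + 1]) {
                int tmp = A[j];                 /* swap A[j] and A[j+1] */
                A[j] = A[j + 1];
                A[j + 1] = tmp;
                sorted = false;
            }
        }
        last = last - 1;
    }
}

int main(void) {
    enum { n = 10 };
    int A[n];
    for (int i = 0; i < n; i++) A[i] = n - i;   /* reverse-sorted: the bad case */

    weirdSort(A, n);
    for (int i = 0; i < n; i++) printf("%d ", A[i]);
    printf("\n");
    return 0;
}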