
Lecture 2

CSCI510 Design and Analysis of
Algorithms (Week 2)
Program running time?
The running time (the time a user waits) is
noticeable and important in:
• web search
• database search
• real-time systems with time constraints
Factors that determine running time of a program
• problem size: n
• basic algorithm / actual processing
• memory access speed
• CPU/processor speed
• # of processors?
• compiler/linker optimization?
Running time of a program or transaction processing time
• amount of input: n → at minimum a linear increase
• basic algorithm / actual processing → depends on the algorithm!
• memory access speed → affects time by a constant factor
• CPU/processor speed → affects time by a constant factor
• # of processors? → yes, if multi-threading or multiple processes are used
• compiler/linker optimization? → roughly 20%
Running time for a program: a closer look
Total time (in clock cycles) combines:
• CPU
• memory access
• disk I/O access
Review: Asymptotic Performance
• Asymptotic performance: how does the algorithm behave as the problem size gets very large?
  o Running time
  o Memory/storage requirements
• Remember that we use the RAM model:
  o All memory equally expensive to access
  o No concurrent operations
  o All reasonable instructions take unit time
    - Except, of course, function calls
  o Constant word size
    - Unless we are explicitly manipulating bits
Analysis
• Simplifications
  o Ignore actual and abstract statement costs
  o Order of growth is the interesting measure:
    - Highest-order term is what counts
    - Remember, we are doing asymptotic analysis
    - As the input size grows larger, it is the high-order term that dominates
Upper Bound Notation
• We say InsertionSort’s run time is O(n²)
  o Properly we should say the run time is in O(n²)
  o Read O as “Big-O” (you’ll also hear it as “order”)
• In general, a function
  o f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0
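As a concrete instance (our example, not from the slides), take f(n) = 3n² + 2n; then

```latex
3n^2 + 2n \;\le\; 3n^2 + 2n^2 \;=\; 5n^2 \quad \text{for all } n \ge 1,
```

so c = 5 and n0 = 1 witness that f(n) is O(n²).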
Lower Bound Notation
• We say InsertionSort’s run time is Ω(n)
• In general, a function
  o f(n) is Ω(g(n)) if there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0
• Proof:
  o Suppose the run time is an + b
    - Assume a and b are positive (what if b is negative?)
  o an ≤ an + b for all n ≥ 1, so c = a and n0 = 1 witness that an + b is Ω(n)
Asymptotic Tight Bound
• A function f(n) is Θ(g(n)) if there exist positive constants c1, c2, and n0 such that
  c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
• Theorem
  o f(n) is Θ(g(n)) iff f(n) is both O(g(n)) and Ω(g(n))
  o Proof: someday
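The deferred proof is a one-liner in each direction; a sketch (our addition, not the slides'):

```latex
\text{If } c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0,
\text{ the right inequality gives } f \in O(g)
\text{ and the left gives } f \in \Omega(g).
\text{Conversely, if } f(n) \le c_2 g(n) \text{ for } n \ge n_0'
\text{ and } c_1 g(n) \le f(n) \text{ for } n \ge n_0'',
\text{ take } n_0 = \max(n_0', n_0'') \text{ to get both at once, i.e. } f \in \Theta(g).
```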
Other Asymptotic Notations
• A function f(n) is o(g(n)) if for every positive constant c there exists an n0 such that
  f(n) < c·g(n) for all n ≥ n0
• A function f(n) is ω(g(n)) if for every positive constant c there exists an n0 such that
  c·g(n) < f(n) for all n ≥ n0
• Intuitively,
  o o() is like <
  o O() is like ≤
  o ω() is like >
  o Ω() is like ≥
  o Θ() is like =
Exercises
What is the complexity of the
following code fragments?
Coding example #1
for ( i=0 ; i<n ; i++ )
    m += i;
Coding example #2
for ( i=0 ; i<n ; i++ )
    for( j=0 ; j<n ; j++ )
        sum[i] += entry[i][j];
Coding example #3
for ( i=0 ; i<n ; i++ )
    for( j=0 ; j<i ; j++ )
        m += j;
Coding example #4
while (n > 1) {
    tot++;
    n = n / 2;
}
Coding example #5
for ( i=0 ; i<n ; i++ )
    for( j=0 ; j<n ; j++ )
        for( k=0 ; k<n ; k++ )
            sum[i][j] += entry[i][j][k];
Coding example #6
for ( i=0 ; i<n ; i++ )
    for( j=0 ; j< sqrt(n) ; j++ )
        m += j;
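A quick way to check your answers is to count loop iterations empirically. The harness below (our sketch, not part of the slides) counts iterations for examples #3 and #4 and compares them against the closed forms n(n-1)/2 and floor(log2 n).

```java
public class IterationCount {
    // Example #3: inner loop bound is i, so total iterations are
    // 0 + 1 + ... + (n-1) = n(n-1)/2, i.e. O(n^2).
    public static long example3(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i; j++)
                count++;
        return count;
    }

    // Example #4: n is halved each pass, so the loop runs
    // floor(log2 n) times, i.e. O(log n).
    public static long example4(int n) {
        long count = 0;
        while (n > 1) {
            count++;
            n = n / 2;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(example3(10)); // 45 = 10*9/2
        System.out.println(example4(16)); // 4  = log2(16)
    }
}
```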
Time Complexity
• a measure of algorithm efficiency
• has a big impact on running time
• Big-O notation is used
• To deal with n items, time complexity can be O(1), O(log n), O(n), O(n log n), O(n²), O(n³), O(2ⁿ), even O(nⁿ).
Practical Complexity
[Figure: growth of f(n) = log n, n, n log n, n², n³, and 2ⁿ for n = 1 to 20; y-axis from 0 to 250]
Compare running time growth rates
Time Complexity → maximum N?
http://www.topcoder.com/tc?module=Static&d1=tutorials&d2=complexity1
Order the following functions from the fastest-growing to the slowest-growing:
4, x^(2/3), x^5, 2^x, log(log x), log(x!), (log x)!, x!, x^x
Solution
1. x^x
2. x!
3. 2^x
4. x^5
5. (log x)!
6. log(x!)
7. x^(2/3)
8. log(log x)
9. 4
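The two least obvious placements can be justified with Stirling's approximation (our note, not in the slides):

```latex
\log(x!) = \Theta(x \log x), \qquad (\log x)! = x^{\Theta(\log\log x)},
```

so (log x)! eventually dominates every fixed power x^c, while log(x!) ≈ x log x still dominates x^(2/3).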
Linear Search
public static int linear_search(int[] arr, int k){
    for(int i=0; i<arr.length; i++)
        if(arr[i]==k)
            return i;
    return -1;
}
• Best-case complexity is O(1), where the element is found at the first index.
• Worst-case complexity is O(n), where the element is found at the last index or is not present in the array.
• Average-case complexity is O(n).
Binary Search
public static int binary_search(int[] arr, int k){
    int l=0, h=arr.length-1, mid;
    while(l<=h){
        mid = l + (h-l)/2;  // avoids overflow of (l+h)/2
        if(arr[mid]==k)
            return mid;     // return the index, as in linear_search
        else if(arr[mid]>k)
            h = mid-1;
        else
            l = mid+1;
    }
    return -1;
}
• Best-case complexity is O(1), where the element is found at the middle index.
• The worst-case complexity is O(log₂ n).
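To see both routines in action, here is a self-contained harness (our sketch; binary_search is reproduced returning the index, mid, for consistency with linear_search and its -1 not-found convention):

```java
public class SearchDemo {
    // Scan left to right; O(n) worst case.
    public static int linear_search(int[] arr, int k) {
        for (int i = 0; i < arr.length; i++)
            if (arr[i] == k)
                return i;
        return -1; // not found
    }

    // Halve the search range each step; O(log n) worst case.
    // Requires arr to be sorted in ascending order.
    public static int binary_search(int[] arr, int k) {
        int l = 0, h = arr.length - 1;
        while (l <= h) {
            int mid = l + (h - l) / 2; // avoids overflow of (l + h) / 2
            if (arr[mid] == k)
                return mid;            // index of the match
            else if (arr[mid] > k)
                h = mid - 1;
            else
                l = mid + 1;
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 8, 12, 16, 23, 38}; // must be sorted for binary search
        System.out.println(linear_search(a, 23)); // 5
        System.out.println(binary_search(a, 23)); // 5
        System.out.println(binary_search(a, 7));  // -1
    }
}
```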
Insertion Sort
InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
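The pseudocode above is 1-indexed; a direct Java translation (our sketch, shifted to 0-indexed arrays) looks like:

```java
import java.util.Arrays;

public class InsertionSortDemo {
    // Insert A[i] into the sorted prefix A[0..i-1],
    // shifting larger keys one slot to the right.
    public static void insertionSort(int[] A) {
        for (int i = 1; i < A.length; i++) {
            int key = A[i];
            int j = i - 1;
            while (j >= 0 && A[j] > key) {
                A[j + 1] = A[j];
                j = j - 1;
            }
            A[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 6, 1, 3};
        insertionSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5, 6]
    }
}
```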
Upper and Lower Bounds for Insertion Sort
• We say InsertionSort’s run time is O(n²)
• We say InsertionSort’s run time is Ω(n)