08a RunningTime

CSCE 2100: Computing Foundations 1
Running Time of Programs
Tamara Schneider
Summer 2013
What is Efficiency?
• Time it takes to run a program?
• Resources
  – Storage space taken by variables
  – Traffic generated on computer network
  – Amount of data moved to and from disk
Summarizing Running Time
Benchmarking
• Use of benchmarks: a small collection of typical inputs
Analysis
• Group inputs by size
Running time is influenced by various factors:
• Computer
• Compiler
Running Time
• worst-case running time: maximum running
time over all inputs of size 𝑛
• average running time: average running time
of all inputs of size 𝑛
• best-case running time: minimum running
time over all inputs of size 𝑛
Worst, Best, and Average Case
[Figure: worst-case, average-case, and best-case running times over the inputs of size n]
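As a concrete illustration of these three cases (an added example, not from the original slides), consider a linear search in C: the best case finds the key in the first position, the worst case examines all n elements, and the average case examines roughly half of them.

/* Added example (not from the slides): linear search over n elements.
   Best case:    key is at index 0         -> constant number of steps
   Worst case:   key is not in the array   -> all n elements examined
   Average case: key equally likely at any -> about n/2 elements examined */
#include <stdio.h>

int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)        /* one comparison per element examined */
            return i;
    return -1;                  /* not found: the worst case */
}

int main(void) {
    int a[] = {7, 3, 9, 4, 1};
    printf("%d\n", linear_search(a, 5, 7));  /* best case: key at index 0 */
    printf("%d\n", linear_search(a, 5, 8));  /* worst case: key absent    */
    return 0;
}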
Running Time of a Program
𝑇(𝑛) is the running time of a program as a function of the input size 𝑛.
– 𝑇(𝑛) = 𝑐𝑛 indicates that the running time is linearly proportional to the size of the input, that is, linear time.
Running Time of Simple Statements
We assume that “primitive operations” take a single instruction:
– Arithmetic operations (+, %, *, -, ...)
– Logical operations (&&, ||, ...)
– Accessing operations (A[i], x->y, ...)
– Simple assignment
– Calls to library functions (scanf, printf, ...)
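A small added sketch (not from the slides) of how this counting model charges a few C statements; the comments tally the primitive operations on each line.

/* Added sketch: each arithmetic, logical, access, and assignment operation,
   and each library call, is charged one instruction. */
#include <stdio.h>

int main(void) {
    int A[3] = {4, 7, 1};
    int x, y, ok;

    x = A[0] + A[2];            /* 2 accesses + 1 addition + 1 assignment      */
    y = x % 2;                  /* 1 arithmetic operation + 1 assignment       */
    ok = (x > y) && (y >= 0);   /* 2 comparisons + 1 logical op + 1 assignment */
    printf("%d %d %d\n", x, y, ok);   /* 1 library call                        */
    return 0;
}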
Code Segment 1
sum = 0;                    1
for(i=0; i<n; i++)          1 + (n+1) + n = 2n+2
    sum++;                  1   (how many times? n times)
Total: 1 + (2n+2) + n*1 = 3n + 3
Complexity?
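An added sketch (not part of the original slides): Code Segment 1 rewritten with an explicit counter, so the hand count of 3n + 3 can be checked for a concrete n (n = 10 is an arbitrary choice here).

/* Added sketch: count the instructions of Code Segment 1 at run time.
   The for loop is unrolled into init / test / body / increment so that
   every primitive operation can be charged explicitly. */
#include <stdio.h>

int main(void) {
    long count = 0;
    int n = 10, sum, i;

    sum = 0;  count++;              /* assignment: 1                   */
    i = 0;    count++;              /* loop initialization: 1          */
    for (;;) {
        count++;                    /* loop test i < n: runs n+1 times */
        if (!(i < n)) break;
        sum++;  count++;            /* loop body: runs n times         */
        i++;    count++;            /* loop increment: runs n times    */
    }

    printf("n = %d, counted = %ld, 3n+3 = %d\n", n, count, 3 * n + 3);
    return 0;
}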
Code Segment 2
sum = 0;                    1
for(i=0; i<n; i++)          2n+2
    for(j=0; j<n; j++)      2n+2   (per outer iteration)
        sum++;              1
Total: 1 + (2n+2) + (2n+2)*n + n*n*1
Complexity?
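Expanding that total (an added worked step; the result agrees with the solution table at the end of the deck):

1 + (2n+2) + (2n+2)*n + n*n*1 = 1 + 2n + 2 + 2n² + 2n + n²
                              = 3n² + 4n + 3, which is O(n²)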
Code Segment 3
sum = 0;                    1
for(i=0; i<n; i++)          2n+2
    for(j=0; j<n*n; j++)    ?
        sum++;              1
Complexity?
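One way to fill in the "?" (an added working; it counts each n*n test as a single comparison and matches the instruction count given at the end of the deck): the inner loop costs 1 + (n²+1) + n² = 2n²+2 per outer iteration, and its body runs n·n² times, so

1 + (2n+2) + n*(2n²+2) + n*n² = 1 + 2n + 2 + 2n³ + 2n + n³
                              = 3n³ + 4n + 3, which is O(n³)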
Code Segment 4
sum = 0;                    1
for(i=0; i<=n; i++)         2n+4
    for(j=0; j<i; j++)      ?
        sum++;              1
Complexity?

Trace of the inner loop:
i=0:  (no iterations)
i=1:  j=0
i=2:  j=0, 1
i=3:  j=0, 1, 2
…
i=n:  j=0, 1, 2, 3, ..., n-1
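The trace shows that sum++ runs 0 + 1 + 2 + ... + n times in total. Using the arithmetic-series formula (an added step; the deck only states the final bound):

0 + 1 + 2 + ... + n = n(n+1)/2 = (1/2)n² + (1/2)n, which is O(n²)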
How Do Running Times Compare?
[Plot comparing the growth of 3·2ⁿ, n², 3n-1, and n-1]
Towards “Big Oh”
[Graph: time t versus input size n, showing a curve c·f(n), e.g. 5n² with c = 5 and f(n) = n², lying above a curve T(n) that describes the running time of some program, e.g. T(n) = 2n² - 4n + 3, for all n to the right of a point n0.]
We can observe that for an input size n ≥ n0, the graph of the function c·f(n) has a higher time value than the graph of the function T(n).
⇒ For n ≥ n0, c·f(n) is an upper bound on T(n), i.e. c·f(n) ≥ T(n).
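For these particular functions, one possible choice (an added working, not from the slide) is n0 = 1: for every n ≥ 1,

T(n) = 2n² - 4n + 3 ≤ 2n² + 3 ≤ 2n² + 3n² = 5n² = c·f(n)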
Big-Oh [1]
• It is too much work to use the exact number of machine instructions
• Instead, hide the details
  – average number of compiler-generated machine instructions
  – average number of instructions executed by a machine per second
• Simplification
  – Instead of 4m-1, write O(m)
• O(m) ?!
Big-Oh [2]
• Restrict the argument to integers 𝑛 ≥ 0
• 𝑇(𝑛) is nonnegative for all 𝑛
Definition:
𝑇(𝑛) is 𝑂(𝑓(𝑛)) if ∃ an integer 𝑛0 and a constant 𝑐 > 0 such that ∀ 𝑛 ≥ 𝑛0: 𝑇(𝑛) ≤ 𝑐 · 𝑓(𝑛)
∃ “there exists”
∀ “for all”
Big-Oh - Example [1]
Definition:
T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0:
∀ n ≥ n0 T(n) ≤ cf(n)
Example 1:
T(0) = 1
T(1) = 4
T(2) = 9
In general: T(n) = (n+1)²
Is T(n) also O(n²)?
Big-Oh - Example [2]
Definition:
T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0:
∀ n ≥ n0 T(n) ≤ cf(n)
T(n) = (n+1)². We want to show that T(n) is O(n²); in other words, f(n) = n².
If this is true, there exist an integer n0 and a constant c > 0 such that for all n ≥ n0: T(n) ≤ cn²
Big-Oh - Example [3]
Definition:
T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0:
∀ n ≥ n0 T(n) ≤ cf(n)
T(n) ≤ cn² ⇔ (n+1)² ≤ cn²
Choose c = 4, n0 = 1: show that (n+1)² ≤ 4n² for n ≥ 1
(n+1)² = n² + 2n + 1
       ≤ n² + 2n² + 1     (since n ≤ n² for n ≥ 1)
       = 3n² + 1
       ≤ 3n² + n²         (since 1 ≤ n² for n ≥ 1)
       = 4n²
       = cn²
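An added sketch (not from the slides) that spot-checks the chosen c = 4 and n0 = 1 numerically; a finite check is only evidence, the algebra above is the proof.

/* Added sketch: check (n+1)^2 <= 4*n^2 for n = 1 .. 1000000,
   matching the choice c = 4, n0 = 1 made above. */
#include <stdio.h>

int main(void) {
    for (long long n = 1; n <= 1000000; n++) {
        long long T = (n + 1) * (n + 1);   /* T(n) = (n+1)^2               */
        long long bound = 4 * n * n;       /* c * f(n) with c = 4, f = n^2 */
        if (T > bound) {
            printf("bound fails at n = %lld\n", n);
            return 1;
        }
    }
    printf("(n+1)^2 <= 4n^2 held for every n checked\n");
    return 0;
}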
Big-Oh - Example [Alt 3]
Definition:
T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0:
∀ n ≥ n0 T(n) ≤ cf(n)
T(n) ≤ cn² ⇔ (n+1)² ≤ cn²
Choose c = 2, n0 = 3: show that (n+1)² ≤ 2n² for n ≥ 3
(n+1)² = n² + 2n + 1
       ≤ n² + n²          (since 2n + 1 ≤ n² for all n ≥ 3)
       = 2n²
       = cn²
Simplification Rules for Big-Oh
• Constant factors can be omitted
  – O(54n²) = O(n²)
• Lower-order terms can be omitted
  – O(n⁴ + n²) = O(n⁴)
  – O(n²) + O(1) = O(n²)
• Note that the highest-order term should never be negative.
  – Lower-order terms can be negative.
  – Negative terms can be omitted since they do not increase the running time.
Transitivity [1]
What is transitivity?
– If A ☺ B and B ☺ C, then A ☺ C
– Example: a < b and b < c, then a < c
  e.g. 2 < 4 and 4 < 7, then 2 < 7, since “<” is transitive
Is Big-Oh transitive?
Transitivity [2]
• If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
  – f(n) is O(g(n)): ∃ n1, c1 such that f(n) ≤ c1 g(n) ∀ n ≥ n1
  – g(n) is O(h(n)): ∃ n2, c2 such that g(n) ≤ c2 h(n) ∀ n ≥ n2
  – Choose n0 = max{n1, n2} and c = c1c2. Then for all n ≥ n0:
    f(n) ≤ c1 g(n) ≤ c1c2 h(n) ⇒ f(n) is O(h(n))
Tightness
• Use constant factor “1”
• Use the tightest upper bound that we can prove
  – 3n is O(n²) and O(n) and O(2ⁿ)
    Which one should we use?
Summation Rule [1]
Consider a program that contains 2 parts:
• Part 1 takes T1(n) time and is O(f1(n))
• Part 2 takes T2(n) time and is O(f2(n))
• We also know that f2 grows no faster than f1
  ⇒ f2(n) is O(f1(n))
• What is the running time of the entire program?
• T1(n) + T2(n) is O(f1(n) + f2(n))
• But can we simplify this?
Summation Rule [2]
• T1(n) + T2(n) is O(f1(n)), since f2 grows no faster than f1
• Proof:
  T1(n) ≤ c1 f1(n) for n ≥ n1
  T2(n) ≤ c2 f2(n) for n ≥ n2
  f2(n) ≤ c3 f1(n) for n ≥ n3
  Choose n0 = max{n1, n2, n3}. Then for all n ≥ n0:
  T1(n) + T2(n) ≤ c1 f1(n) + c2 f2(n)
               ≤ c1 f1(n) + c2c3 f1(n)
               = (c1 + c2c3) f1(n)
               = c f1(n)   with c = c1 + c2c3
  ⇒ T1(n) + T2(n) is O(f1(n))
Summation Rule - Example
//make A identity matrix
scanf("%d", &d);              O(1)
for(i=0; i<n; i++)            O(n) iterations
    for(j=0; j<n; j++)        O(n) iterations each
        A[i][j] = 0;          O(1) body  ⇒  O(n²) in total
for(i=0; i<n; i++)            O(n) iterations
    A[i][i] = 1;              O(1) body  ⇒  O(n) in total
O(1) + O(n²) + O(n) = O(n²)
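A self-contained, compilable version of this example (an added sketch; the fixed size N and the printing loop are additions for illustration, and the scanf of d is kept from the slide even though d is not used afterwards).

/* Added sketch: the identity-matrix example as a complete program. */
#include <stdio.h>
#define N 4

int main(void) {
    int A[N][N];
    int i, j, d, n = N;

    scanf("%d", &d);                 /* O(1)                         */
    for (i = 0; i < n; i++)          /* O(n) iterations              */
        for (j = 0; j < n; j++)      /* O(n) iterations each         */
            A[i][j] = 0;             /* O(1) body -> O(n^2) in total */
    for (i = 0; i < n; i++)          /* O(n) iterations              */
        A[i][i] = 1;                 /* O(1) body -> O(n) in total   */

    /* Total: O(1) + O(n^2) + O(n) = O(n^2) */
    for (i = 0; i < n; i++) {
        for (j = 0; j < n; j++)
            printf("%d ", A[i][j]);
        printf("\n");
    }
    return 0;
}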
Summary of Rules & Concepts [1]
• Worst-case, average-case, and best-case running times are compared for a fixed input size n, not for varying n!
• Counting instructions
  – Assume 1 instruction for assignments, simple calculations, comparisons, etc.
• Definition of Big-Oh:
  T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ cf(n)
Summary of Rules & Concepts [2]
• Rule 1: Constant factors can be omitted
  – Example: O(3n⁵) = O(n⁵)
• Rule 2: Lower-order terms can be omitted
  – Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(3n⁵)
• We can combine Rule 1 and Rule 2:
  – Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(n⁵)
Summary of Rules & Concepts [3]
• For O(f(n) + g(n)), we can neglect the function with the slower growth rate.
  – Example: O(f(n) + g(n)) = O(n + n log n) = O(n log n)
• Transitivity: If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
  – Example: f(n) = 3n, g(n) = n², h(n) = n⁶
    3n is O(n²) and n² is O(n⁶) ⇒ 3n is O(n⁶)
• Tightness: We try to find an upper bound (Big-Oh) that is as small as possible.
  – Example: n² is O(n⁶), but O(n²) is a much tighter (and better) bound.
Solutions to Instruction Counts on Code Segments

                  Instructions      Big-Oh
Code Segment 1    3n + 3            O(n)
Code Segment 2    3n² + 4n + 3      O(n²)
Code Segment 3    3n³ + 4n + 3      O(n³)
Code Segment 4    Argh!             O(n²)