Analysis of Algorithms:
Methods and Examples
CSE 2320 – Algorithms and Data Structures
Vassilis Athitsos
Modified by Alexandra Stefan
University of Texas at Arlington
Big-Oh Notation
• A function g(N) is said to be O(f(N)) if there exist
constants c0 and N0 such that:
g(N) ≤ c0 f(N) for all N ≥ N0.
• Typically, g(N) is the running time of an algorithm, in
your favorite units, implementation, and machine.
This can be a rather complicated function.
• In algorithmic analysis, we try to find an f(N) that is
simple, and such that g(N) = O(f(N)).
Big-Oh Notation
• Upper bound: A function g(N) is said to be O(f(N)) if there
exist constants c0 and N0 such that:
g(N) ≤ c0 f(N) for all N ≥ N0.
• Lower bound: A function g(N) is said to be Ω(f(N)) if there
exist constants c0 and N0 such that:
c0 f(N) ≤ g(N) for all N ≥ N0.
• Tight bound: A function g(N) is said to be Θ(f(N)) if there exist
constants c0 , c1 and N0 such that:
c0 f(N) ≤ g(N) ≤ c1 f(N) for all N ≥ N0.
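As a concrete instance of the definition above, the witnesses c0 and N0 can be exhibited and checked numerically. The function g(N) = 5N + 20 and the choices c0 = 6, N0 = 20 below are illustrative examples, not from the slides; this is a sketch in Python:

```python
# Hypothetical example: show g(N) = 5N + 20 is O(N) by exhibiting witnesses.
# For N >= 20 we have 20 <= N, so 5N + 20 <= 6N; thus c0 = 6, N0 = 20 work.

def g(N):
    return 5 * N + 20

def f(N):
    return N

c0, N0 = 6, 20
assert all(g(N) <= c0 * f(N) for N in range(N0, 10000))
print("g(N) <= c0*f(N) verified for all tested N >= N0")
```

Note that the witnesses are not unique: any larger c0 or N0 also works.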
Properties of O, Ω and Θ
• g(N) = O(f(N)) => f(N) = Ω(g(N))
• g(N) = Ω(f(N)) => f(N) = O(g(N))
• g(N) = Θ(f(N)) => f(N) = Θ(g(N))
• If g(N) = O(f(N)) and g(N) = Ω(f(N)) => g(N) = Θ(f(N))
• Transitivity (proved in future slides):
– If g(N) = O(f(N)) and f(N) = O(h(N)), then g(N) = O(h(N)).
Using Limits
• If lim_{N→∞} g(N)/f(N) is a constant, then g(N) = O(f(N)).
– "Constant" includes zero, but does NOT include infinity.
• If lim_{N→∞} f(N)/g(N) = ∞, then g(N) = O(f(N)).
• If lim_{N→∞} f(N)/g(N) is a constant, then g(N) = Ω(f(N)).
– Again, "constant" includes zero, but not infinity.
• If lim_{N→∞} f(N)/g(N) is a non-zero constant, then g(N) = Θ(f(N)).
– In this definition, both zero and infinity are excluded.
Using Limits: An Example
• Show that (n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^3 + n + 3) = Θ(???).
Using Limits: An Example
• Show that (n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^3 + n + 3) = Θ(n^2).
• Let g(n) = (n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^3 + n + 3).
• Let f(n) = n^2.

lim_{n→∞} g(n)/f(n) = lim_{n→∞} [(n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^3 + n + 3)] · (1/n^2)
                    = lim_{n→∞} (n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^5 + n^3 + 3n^2)
                    = 1/5
Using Limits: An Example
• Show that (n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^3 + n + 3) = Θ(n^2).
• Let g(n) = (n^5 + 3n^4 + 2n^3 + n^2 + n + 12) / (5n^3 + n + 3).
• Let f(n) = n^2.
• In the previous slide, we showed that lim_{n→∞} g(n)/f(n) = 1/5.
• Therefore, g(n) = Θ(f(n)).
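The limit in this example can also be checked numerically; a small Python sketch (the sample values of n are arbitrary):

```python
# Evaluate g(n)/f(n) for growing n; the ratio should approach 1/5 = 0.2.

def g(n):
    return (n**5 + 3*n**4 + 2*n**3 + n**2 + n + 12) / (5*n**3 + n + 3)

def f(n):
    return n**2

for n in [10, 100, 1000, 100000]:
    print(n, g(n) / f(n))   # ratios approach 0.2
```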
Big-Oh Transitivity
• If g(N) = O(f(N)) and f(N) = O(h(N)), then g(N) = O(h(N)).
• How can we prove that?
Big-Oh Hierarchy
• 1 = O(lg(N))
• lg(N) = O(N)
• N = O(N^2)
• If 0 ≤ c ≤ d, then N^c = O(N^d).
– Higher-order polynomials always get larger than lower-order polynomials, eventually.
• For any d, if c > 1, N^d = O(c^N).
– Exponential functions always get larger than polynomial functions, eventually.
• You can use these facts in your assignments.
• You can apply transitivity to derive other facts, e.g., that lg(N) = O(N^2).
Big-Oh Transitivity
• If g(N) = O(f(N)) and f(N) = O(h(N)), then g(N) = O(h(N)).
• How can we prove that? Using the definition of the
big-Oh notation.
• g(N) ≤ c0 f(N) for all N ≥ N0.
• f(N) ≤ c1 h(N) for all N ≥ N1.
• Set:
– c2 = c0 * c1
– N2 = max(N0, N1)
• Then, for all N ≥ N2, both inequalities hold, so:
g(N) ≤ c0 f(N) ≤ c0 c1 h(N) = c2 h(N).
Using Substitutions
• If lim_{x→∞} h(x) = ∞, then:
g(x) = O(f(x)) ⇒ g(h(x)) = O(f(h(x))).
• How do we use that?
• For example, prove that lg(N) = O(√N).
Using Substitutions
• If lim_{x→∞} h(x) = ∞, then:
g(x) = O(f(x)) ⇒ g(h(x)) = O(f(h(x))).
• How do we use that?
• For example, prove that lg(N) = O(√N).
• Use h(N) = √N. We get:
lg(N) = O(N) ⇒ lg(√N) = O(√N).
• Since lg(√N) = (1/2) lg(N), this gives lg(N) = O(√N).
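To see this bound numerically, the ratio lg(N)/√N can be tabulated; a quick sketch (the sample points are arbitrary):

```python
import math

# lg(N)/sqrt(N) shrinks toward 0 as N grows, consistent with lg(N) = O(sqrt(N)).
# (Since the limit is 0, the bound holds but is not tight.)
for N in [2**10, 2**20, 2**40]:
    print(N, math.log2(N) / math.sqrt(N))
```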
Summations
• Summations are formulas of the sort: Σ_{k=0}^{n} f(k)
• Computing the values of summations can be handy
when trying to solve recurrences.
• Oftentimes, establishing upper bounds is sufficient,
since we use big-Oh notation.
• If f(k) ≥ 0, then: Σ_{k=0}^{n} f(k) ≤ Σ_{k=0}^{∞} f(k)
• Sometimes, summing to infinity gives a simpler formula.
Geometric Series
• A geometric series is a sequence C_k of numbers, such
that C_k = D * C_{k-1}, where D is a constant.
• How can we express C_1 in terms of C_0?
– C_1 = D * C_0
• How can we express C_2 in terms of C_0?
– C_2 = D * C_1 = D^2 * C_0
• How can we express C_k in terms of C_0?
– C_k = D^k * C_0
• So, to define a geometric series, we just need two
parameters: D and C_0.
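A short sketch of this two-parameter view (the values C0 = 3 and D = 2 are arbitrary illustrative choices):

```python
# C_k = D**k * C0 fully determines the series from just D and C0.

def geometric_term(C0, D, k):
    return (D ** k) * C0

C0, D = 3, 2
terms = [geometric_term(C0, D, k) for k in range(5)]
print(terms)  # [3, 6, 12, 24, 48]
# Each term is D times the previous one, matching C_k = D * C_{k-1}.
assert all(terms[k] == D * terms[k - 1] for k in range(1, 5))
```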
Summation of Geometric Series
• This is supposed to be a review of material you have seen in
Math courses:
• Suppose that 0 < x < 1:
• Finite summations: Σ_{k=0}^{n} x^k = (1 − x^{n+1}) / (1 − x)
• Infinite summations: Σ_{k=0}^{∞} x^k = 1 / (1 − x)
• Important to note: Σ_{k=0}^{n} x^k ≤ Σ_{k=0}^{∞} x^k = 1 / (1 − x).
Therefore, Σ_{k=0}^{n} x^k = O(1). Why?
Summation of Geometric Series
• This is supposed to be a review of material you have seen in
Math courses:
• Suppose that 0 < x < 1:
• Finite summations: Σ_{k=0}^{n} x^k = (1 − x^{n+1}) / (1 − x)
• Infinite summations: Σ_{k=0}^{∞} x^k = 1 / (1 − x)
• Important to note: Σ_{k=0}^{n} x^k ≤ Σ_{k=0}^{∞} x^k = 1 / (1 − x).
Therefore, Σ_{k=0}^{n} x^k = O(1). Why?
– Because 1 / (1 − x) is a constant, independent of n.
Summation of Geometric Series
• Suppose that 𝑥 > 1: The formula for finite summations is the
same, and can be rewritten as:
• Σ_{k=0}^{n} x^k = (x^{n+1} − 1) / (x − 1)
• This can be a handy formula in solving recurrences:
• For example:
1 + 5 + 5^2 + 5^3 + … + 5^n = (5^{n+1} − 1) / (5 − 1) = O(5^n)
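Both closed forms can be sanity-checked against a direct summation; a sketch (the sample values x = 0.5, x = 5 and the cutoffs are arbitrary):

```python
# Compare the closed forms with brute-force sums for both ranges of x.

def geometric_sum(x, n):
    return sum(x ** k for k in range(n + 1))

# 0 < x < 1: sum = (1 - x^(n+1)) / (1 - x)
assert abs(geometric_sum(0.5, 20) - (1 - 0.5**21) / (1 - 0.5)) < 1e-9

# x > 1: sum = (x^(n+1) - 1) / (x - 1); e.g., 1 + 5 + ... + 5^10
assert geometric_sum(5, 10) == (5**11 - 1) // (5 - 1)
print("closed forms match the direct sums")
```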
Approximation by Integrals
• Suppose that f(x) is a monotonically increasing
function:
– This means that x ≤ y ⇒ f(x) ≤ f(y).
• Then, we can approximate summation Σ_{k=m}^{n} f(k)
using integral ∫_m^{n+1} f(x) dx.
• Why? Because f(k) ≤ ∫_k^{k+1} f(x) dx.
• For every x in the interval [k, k+1], x ≥ k. Since
f(x) is increasing, if x ≥ k then f(x) ≥ f(k).
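A minimal numeric sketch of this bound, using f(x) = x as an illustrative increasing function (the range 1..1000 is arbitrary):

```python
# sum_{k=m}^{n} k  <=  integral from m to n+1 of x dx  =  ((n+1)^2 - m^2) / 2

m, n = 1, 1000
summation = sum(range(m, n + 1))
integral = ((n + 1) ** 2 - m ** 2) / 2   # exact, from the antiderivative x^2/2
assert summation <= integral
print(summation, "<=", integral)
```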
Solving Recurrences: Example 1
• Suppose that we have an algorithm that at each step:
– takes O(N^2) time to go over N items.
– eliminates one item and then calls itself with the
remaining data.
• How do we write this recurrence?
Solving Recurrences: Example 1
• Suppose that we have an algorithm that at each step:
– takes O(N^2) time to go over N items.
– eliminates one item and then calls itself with the
remaining data.
• How do we write this recurrence?
• g(N) = g(N−1) + N^2
= g(N−2) + (N−1)^2 + N^2
= g(N−3) + (N−2)^2 + (N−1)^2 + N^2
…
= 1^2 + 2^2 + … + N^2
= Σ_{k=1}^{N} k^2. How do we approximate that?
Solving Recurrences: Example 1
• We approximate Σ_{k=1}^{N} k^2 using an integral:
• Clearly, f(x) = x^2 is a monotonically increasing
function.
• So, Σ_{k=1}^{N} k^2 ≤ ∫_1^{N+1} x^2 dx = ((N+1)^3 − 1^3) / 3
= ((N^3 + 3N^2 + 3N + 1) − 1) / 3 = Θ(N^3)
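The Θ(N^3) conclusion can be checked by unrolling the recurrence directly (taking g(0) = 0 as an illustrative base case, since the constant does not affect the growth rate):

```python
# Unroll g(N) = g(N-1) + N^2 iteratively and compare with N^3.

def g(N):
    total = 0
    for k in range(1, N + 1):   # equals 1^2 + 2^2 + ... + N^2
        total += k * k
    return total

for N in [10, 100, 1000]:
    print(N, g(N) / N**3)       # ratios approach 1/3
```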
Solving Recurrences: Example 2
• Suppose that we have an algorithm that at each step:
– takes O(lg(N)) time to go over N items.
– eliminates one item and then calls itself with the
remaining data.
• How do we write this recurrence?
Solving Recurrences: Example 2
• Suppose that we have an algorithm that at each step:
– takes O(lg(N)) time to go over N items.
– eliminates one item and then calls itself with the
remaining data.
• How do we write this recurrence?
• g(N) = g(N−1) + lg(N)
= g(N−2) + lg(N−1) + lg(N)
= g(N−3) + lg(N−2) + lg(N−1) + lg(N)
…
= lg(1) + lg(2) + … + lg(N)
= Σ_{k=1}^{N} lg(k). How do we compute that?
Solving Recurrences: Example 2
• We process Σ_{k=1}^{N} lg(k) using the fact that:
lg(a) + lg(b) = lg(ab)
• Σ_{k=1}^{N} lg(k) = lg(1) + lg(2) + … + lg(N)
= lg(N!)
≌ lg((N/e)^N)     (by Stirling's approximation)
= N lg(N/e)
= N lg(N) − N lg(e) = O(N lg(N))
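The Stirling step can be checked numerically (the sample sizes are arbitrary); the ratio of the exact sum to the estimate N lg(N) − N lg(e) tends to 1:

```python
import math

# Compare sum of lg(k) (i.e., lg(N!)) with the estimate N*lg(N) - N*lg(e).

def lg_factorial(N):
    return sum(math.log2(k) for k in range(1, N + 1))

for N in [100, 1000, 10000]:
    estimate = N * math.log2(N) - N * math.log2(math.e)
    print(N, lg_factorial(N) / estimate)   # ratios approach 1
```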
Solving Recurrences: Example 3
• Suppose that we have an algorithm that at each step:
– takes O(1) time to go over N items.
– calls itself 3 times on data of size N-1.
– takes O(1) time to combine the results.
• How do we write this recurrence?
Solving Recurrences: Example 3
• Suppose that we have an algorithm that at each step:
– takes O(1) time to go over N items.
– calls itself 3 times on data of size N-1.
– takes O(1) time to combine the results.
• How do we write this recurrence?
• g(N) = 3g(N−1) + 1       (why use '+1' and not '+2'?)
= 3^2 g(N−2) + 3 + 1
= 3^3 g(N−3) + 3^2 + 3 + 1
…
= 3^{N−1} g(1) + 3^{N−2} + 3^{N−3} + 3^{N−4} + … + 1
• Note: g(1) is just a constant, and the rest is a finite
geometric summation.
Solving Recurrences: Example 3
• Suppose that we have an algorithm that at each step:
– takes O(1) time to go over N items.
– calls itself 3 times on data of size N-1.
– takes O(1) time to combine the results.
• How do we write this recurrence?
• g(N) = 3g(N−1) + 1
= 3^2 g(N−2) + 3 + 1
= 3^3 g(N−3) + 3^2 + 3 + 1
…
= 3^{N−1} g(1) + 3^{N−2} + 3^{N−3} + 3^{N−4} + … + 1
= O(3^N) + O(3^N) = O(3^N)
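With g(1) = 1 as an illustrative base case, the unrolled sum collapses to a closed form, (3^N − 1)/2, which is indeed O(3^N); a quick sketch:

```python
# Iterate g(N) = 3*g(N-1) + 1 from g(1) = 1 and compare to (3^N - 1) / 2.

def g(N):
    value = 1                 # g(1) = 1 (illustrative constant)
    for _ in range(2, N + 1):
        value = 3 * value + 1
    return value

for N in [1, 2, 5, 10]:
    assert g(N) == (3**N - 1) // 2
print("g(N) matches (3^N - 1)/2 for all tested N")
```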
Solving Recurrences: Example 4
• Suppose that we have an algorithm that at each step:
– calls itself N times on data of size N/2.
– takes O(1) time to combine the results.
• How do we write this recurrence?
Solving Recurrences: Example 4
• Suppose that we have an algorithm that at each step:
– calls itself N times on data of size N/2.
– takes O(1) time to combine the results.
• How do we write this recurrence? Let n = lg(N).
• g(2^n) = 2^n g(2^{n−1}) + 1
= 2^n 2^{n−1} g(2^{n−2}) + 2^n + 1
= 2^n 2^{n−1} 2^{n−2} g(2^{n−3}) + 2^n 2^{n−1} + 2^n + 1
= (Π_{k=n−2}^{n} 2^k) g(2^{n−3}) + 1 + Σ_{k=n−1}^{n} Π_{i=k}^{n} 2^i
Solving Recurrences: Example 4
• Suppose that we have an algorithm that at each step:
– calls itself N times on data of size N/2.
– takes O(1) time to combine the results.
• How do we write this recurrence? Let n = lg(N).
• g(2^n) = 2^n g(2^{n−1}) + 1
= 2^n 2^{n−1} g(2^{n−2}) + 2^n + 1
= 2^n 2^{n−1} 2^{n−2} g(2^{n−3}) + 2^n 2^{n−1} + 2^n + 1
= (Π_{k=n−3}^{n} 2^k) g(2^{n−4}) + 1 + Σ_{k=n−2}^{n} Π_{i=k}^{n} 2^i
Solving Recurrences: Example 4
• Suppose that we have an algorithm that at each step:
– calls itself N times on data of size N/2.
– takes O(1) time to combine the results.
• How do we write this recurrence? Let n = lg(N).
• g(2^n) = 2^n g(2^{n−1}) + 1
= 2^n 2^{n−1} g(2^{n−2}) + 2^n + 1
= 2^n 2^{n−1} 2^{n−2} g(2^{n−3}) + 2^n 2^{n−1} + 2^n + 1
…
= (Π_{k=1}^{n} 2^k) g(1) + 1 + Σ_{k=2}^{n} Π_{i=k}^{n} 2^i
Solving Recurrences: Example 4
Π_{k=1}^{n} 2^k = 2^n 2^{n−1} 2^{n−2} … 2^2 2^1
= 2^{(n + (n−1) + (n−2) + … + 1)}
= 2^{n(n+1)/2}
= (2^n)^{(n+1)/2}
= N^{(lg(N)+1)/2}          (substituting N for 2^n)
= O(N^{(lg(N)+1)/2})
Solving Recurrences: Example 4
• Let X = Π_{k=1}^{n} 2^k (which we have just computed).
• Σ_{k=2}^{n} Π_{i=k}^{n} 2^i < X + X/2 + X/4 + … ⇒
Σ_{k=2}^{n} Π_{i=k}^{n} 2^i < 2X ⇒ (taking the previous slide into account)
Σ_{k=2}^{n} Π_{i=k}^{n} 2^i = O(N^{(lg(N)+1)/2})
Solving Recurrences: Example 4
• Based on the previous two slides, we can conclude
that the solution of g(N) = N g(N/2) + 1 is:
g(N) = O(N^{(lg(N)+1)/2})
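The bound can be sanity-checked by evaluating the recurrence directly for powers of 2 (g(1) = 1 is an illustrative base case); the ratio against N^{(lg(N)+1)/2} stays bounded as N grows:

```python
import math

# Evaluate g(N) = N*g(N/2) + 1 for N = 2^n and compare with N^((lg N + 1)/2).

def g(N):
    if N == 1:
        return 1              # illustrative base case
    return N * g(N // 2) + 1

for n in range(2, 12):
    N = 2 ** n
    bound = N ** ((math.log2(N) + 1) / 2)
    print(N, g(N) / bound)    # ratios settle near a small constant
```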
Big-Oh Notation: Example Problem
• Is N = O(sin(N) N^2)?
• Answer:
Big-Oh Notation: Example Problem
• Is N = O(sin(N) N^2)?
• Answer: no!
• Why? sin(N) fluctuates forever between −1 and 1.
• As a result, sin(N) N^2 fluctuates forever between
negative and positive values.
• Therefore, for every possible c0 ≥ 0 and N0, we can
always find an N ≥ N0 such that:
N > c0 sin(N) N^2
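This can be illustrated with a quick scan (c0 = 100 and N0 = 1000 are arbitrary sample choices; the argument works for any values):

```python
import math

# Whatever c0 and N0 we pick, some N >= N0 has sin(N) < 0, making
# c0*sin(N)*N^2 negative while N is positive, so N > c0*sin(N)*N^2.
# Since sin is negative on half of each ~6.28-long period, a scan of
# a few consecutive integers always finds such an N.

c0, N0 = 100, 1000
for N in range(N0, N0 + 10):
    if math.sin(N) < 0:
        assert N > c0 * math.sin(N) * N ** 2
        print("counterexample found at N =", N)
        break
```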
Useful logarithm properties
• c^{lg(N)} = N^{lg(c)}
– Proof: apply lg on both sides and you get two equal terms:
lg(c^{lg(N)}) = lg(N^{lg(c)})  =>  lg(N) * lg(c) = lg(N) * lg(c)
– Can we also say that c^N = N^c?
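Both the identity and the closing question can be checked numerically (c = 3, N = 64 and the counterexample c = 2, N = 10 are illustrative choices):

```python
import math

# c^(lg N) = N^(lg c): taking lg of both sides gives lg(N)*lg(c) either way.
c, N = 3.0, 64.0
assert math.isclose(c ** math.log2(N), N ** math.log2(c))
print(c ** math.log2(N), N ** math.log2(c))   # both equal 729 up to rounding

# c^N = N^c does NOT hold in general: e.g., 2^10 = 1024 but 10^2 = 100.
assert 2 ** 10 != 10 ** 2
```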