Growth of Functions
Jeff Chastine
Asymptotic Notation
• When we look at inputs n that are large enough, we are studying asymptotic efficiency
• Concerned with how the running time grows as the input size increases without bound
• Usually, an algorithm that is asymptotically more efficient is the best choice (except possibly for small n)
Θ-notation (‘Theta’)
• Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀ }
• This means that f(n) and g(n) grow at the same rate; they differ by at most a constant factor once n is large
• "Sandwiched" for sufficiently large n
• We say g(n) is an asymptotically tight bound for f(n)
Big-Θ
A Formal Definition
½n² - 3n = Θ(n²)
We need c₁n² ≤ ½n² - 3n ≤ c₂n² for all n ≥ n₀
Dividing by n² yields
c₁ ≤ ½ - 3/n ≤ c₂
The right inequality holds for any c₂ ≥ ½; the left holds with, for example, c₁ = 1/14 once n ≥ 7, so c₁ = 1/14, c₂ = ½, n₀ = 7 works.
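A quick numeric sanity check in Python (a sketch, not a proof) of these constants, using exact rational arithmetic so rounding cannot blur the boundary case n = 7:

    from fractions import Fraction as F

    def f(n):
        return F(1, 2) * n * n - 3 * n     # f(n) = (1/2)n^2 - 3n

    def g(n):
        return n * n                       # g(n) = n^2

    c1, c2, n0 = F(1, 14), F(1, 2), 7      # one valid choice of constants
    for n in range(n0, 10_001):
        assert c1 * g(n) <= f(n) <= c2 * g(n), f"sandwich fails at n = {n}"
    print("c1*g(n) <= f(n) <= c2*g(n) holds for all tested n in [7, 10000]")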
Why 6n³ ≠ Θ(n²)
Suppose 6n³ ≤ c₂n² for all n ≥ n₀
Divide each side by 6n²:
n ≤ c₂/6, which cannot hold for all n ≥ n₀, since n grows without bound!
Equivalently, divide each side by 6n³:
1 ≤ c₂/(6n), which fails as soon as n > c₂/6!
So no constant c₂ works: 6n³ is not Ο(n²), and therefore not Θ(n²).
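A tiny Python illustration (a sketch, not part of the argument): the ratio 6n³ / n² = 6n keeps growing, so no single constant c₂ can cap it.

    # The ratio 6n^3 / n^2 equals 6n, which grows without bound,
    # so 6n^3 <= c2 * n^2 cannot hold for all large n.
    for n in [10, 100, 1_000, 10_000]:
        print(n, (6 * n**3) // n**2)   # prints 6n: 60, 600, 6000, 60000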
Practice
• Do this now!
• Prove 3n² + n = Θ(n²)
Note
• Now you can see:
– Why constants (coefficients) don't matter
– Why lower-order terms don't matter
• In general, for any polynomial of degree d with positive leading coefficient:
∑ᵢ₌₀ᵈ aᵢnⁱ = Θ(nᵈ)
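A short Python illustration (with an arbitrary example polynomial, not one from the slides): the ratio p(n)/nᵈ settles near the leading coefficient, which is why sandwich constants exist.

    # Example polynomial p(n) = 5n^3 + 2n^2 + 7 (degree d = 3, leading coefficient 5).
    # The ratio p(n) / n^3 approaches 5, so for large n the polynomial sits
    # between c1*n^3 and c2*n^3 for constants such as c1 = 4 and c2 = 6.
    def p(n):
        return 5 * n**3 + 2 * n**2 + 7

    for n in [10, 100, 1_000, 10_000]:
        print(n, p(n) / n**3)   # ratio tends toward 5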
Ο-notation (Big-Oh)
• Ο(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n₀ }
• This means that once n is large enough (n ≥ n₀), f(n) never exceeds cg(n); the bound need not be tight (e.g., 2n = Ο(n²) as well as Ο(n))
• We say g(n) is an asymptotic upper bound for f(n)
• Associated with worst-case running time: an upper bound that holds for every input in particular bounds the worst case
Big-Oh
A Formal Definition
10n² - 3n = Ο(n³)
We need 10n² - 3n ≤ cn³ for all n ≥ n₀
Dividing by n² yields
10 - 3/n ≤ cn, which holds, for example, with c = 10 and n₀ = 1
Ω-notation (‘Omega’)
• Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n₀ }
• This means that once n is large enough (n ≥ n₀), f(n) is at least cg(n)
• We say g(n) is an asymptotic lower bound for f(n)
• Associated with best-case running time: a lower bound that holds for every input in particular bounds the best case
Big-Ω
Insertion Sort
(revisited)
• Running time falls between Ω(n) (best case: already-sorted input) and Ο(n²) (worst case)
• The worst-case running time of insertion sort is in fact Ω(n²): a reverse-sorted input forces roughly n²/2 comparisons and shifts, so the worst case is Θ(n²)
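A small instrumented Python sketch (not from the slides) makes the gap visible by counting key comparisons: an already-sorted input does about n - 1 of them, a reverse-sorted input about n(n - 1)/2.

    # Insertion sort instrumented to count key comparisons.
    def insertion_sort(a):
        a = list(a)
        comparisons = 0
        for j in range(1, len(a)):
            key = a[j]
            i = j - 1
            while i >= 0:
                comparisons += 1           # compare key against a[i]
                if a[i] <= key:
                    break
                a[i + 1] = a[i]            # shift the larger element right
                i -= 1
            a[i + 1] = key
        return a, comparisons

    n = 1_000
    _, best = insertion_sort(range(n))             # sorted input
    _, worst = insertion_sort(range(n, 0, -1))     # reverse-sorted input
    print(best, worst)   # 999 vs. 499500 — roughly n vs. n^2/2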
ο-notation (little-oh)
• ο(g(n)) = { f(n) : for any positive constant c, there exists a constant n₀ > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n₀ }
• Example: 2n = ο(n²), but 2n² ≠ ο(n²)
• ο denotes an upper bound that is not asymptotically tight
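Equivalently (a standard limit view, assuming g(n) > 0 for large n), f(n) = ο(g(n)) means the ratio f(n)/g(n) vanishes:

    \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0,
    \qquad \text{e.g. } \lim_{n \to \infty} \frac{2n}{n^2} = 0,
    \quad \text{but } \lim_{n \to \infty} \frac{2n^2}{n^2} = 2 \neq 0.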
ω-notation (little omega)
• ω(g(n)) = { f(n) : for any positive constant c, there exists a constant n₀ > 0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n₀ }
• Example: n²/2 = ω(n), but n²/2 ≠ ω(n²)
• ω denotes a lower bound that is not asymptotically tight
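Symmetrically (same assumption that g(n) > 0 for large n), f(n) = ω(g(n)) means the ratio grows without bound:

    \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty,
    \qquad \text{e.g. } \lim_{n \to \infty} \frac{n^2/2}{n} = \infty,
    \quad \text{but } \lim_{n \to \infty} \frac{n^2/2}{n^2} = \tfrac{1}{2} \neq \infty.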
Standard Notation
• A function f is monotonically increasing if m ≤ n implies f(m) ≤ f(n)
• Similarly, f is monotonically decreasing if m ≤ n implies f(m) ≥ f(n)
• The floor ⌊x⌋ rounds x down to the nearest integer
• The ceiling ⌈x⌉ rounds x up to the nearest integer
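A tiny Python illustration (using the standard math module), including the handy identity ⌈n/2⌉ + ⌊n/2⌋ = n for integers n:

    import math

    print(math.floor(2.7), math.ceil(2.3))   # 2 3

    # For every integer n, ceil(n/2) + floor(n/2) equals n.
    for n in range(-5, 6):
        assert math.ceil(n / 2) + math.floor(n / 2) == n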