Algorithm Analysis

• To do analysis, we need a model that tells us how much each basic operation costs.
• RAM (Random Access Machine) model:
  – has all the basic operations: +, -, *, /, =, comparisons
  – fixed-size integers (e.g., 32-bit)
  – infinite memory
  – all (basic) operations take exactly one time unit to execute

Critique of the model

• Weaknesses:
  – not all operations take the same amount of time on a real machine
  – does not account for page faults, disk accesses, etc.
• Strengths:
  – simple
  – easier to prove things about the model than about a real machine

Why models?

(diagram: model ↔ real world)
• Idea: (useful) statements we can make about the model can also be made about a real computer.

Relative rates of growth

• Distinguish between fast- and slow-growing functions.
• Example: n³ + 2n² versus 100n² + 1000.

Big-Oh notation

• Defn: T(N) = O(f(N)) if there are positive constants c and n₀ such that T(N) ≤ c·f(N) for all N ≥ n₀.
• Idea: we are concerned with how the function grows when N is large. We are not picky about constant factors: coarse distinctions among functions.
• Lingo: "T(N) grows no faster than f(N)."

Examples

• n = O(2n)?
• 2n = O(n)?
• n = O(n²)?
• n² = O(n)?
• n = O(1)?
• 100 = O(n)?
• 10 log n = O(n)?
• 214n + 34 = O(2n² + 8n)?  (a worked check appears at the end of these notes)

Big Omega, Theta

• Defn: T(N) = Ω(g(N)) if there are positive constants c and n₀ such that T(N) ≥ c·g(N) for all N ≥ n₀.
• Lingo: "T(N) grows no slower than g(N)."
• Defn: T(N) = Θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)).
• Big-Oh, Omega, and Theta establish a relative order among all functions of N.

Intuition, little-Oh notation

• Intuition:
  – O (Big-Oh)      "≤"
  – Ω (Big-Omega)   "≥"
  – Θ (Theta)       "="
  – o (little-Oh)   "<"
• Defn: T(N) = o(p(N)) if T(N) = O(p(N)) and T(N) ≠ Θ(p(N)).

More about Asymptotics

• Fact: If f(N) = O(g(N)), then g(N) = Ω(f(N)).
• Proof: Suppose f(N) = O(g(N)). Then there exist positive constants c and n₀ such that f(N) ≤ c·g(N) for all N ≥ n₀. Then g(N) ≥ (1/c)·f(N) for all N ≥ n₀, and so g(N) = Ω(f(N)).

More terminology

• Suppose T(N) = O(f(N)). Then f(N) is an upper bound on T(N): T(N) grows no faster than f(N).
• Suppose T(N) = Ω(g(N)). Then g(N) is a lower bound on T(N): T(N) grows at least as fast as g(N).
• If T(N) = o(h(N)), then we say that T(N) grows strictly slower than h(N).

Style

• If f(N) = 5N, then all of the following are true:
  – f(N) = O(N⁵)
  – f(N) = O(N³)
  – f(N) = O(N)        ← preferred
  – f(N) = O(N log N)
• Ignore constant multiplicative factors and low-order terms:
  – T(N) = O(N), not T(N) = O(5N).
  – T(N) = O(N³), not T(N) = O(N³ + N² + N log N).
• Bad style: f(N) ≤ O(g(N)). Wrong: f(N) ≥ O(g(N)).

Facts about Big-Oh

• If T₁(N) = O(f(N)) and T₂(N) = O(g(N)), then
  – T₁(N) + T₂(N) = O(f(N) + g(N))
  – T₁(N) · T₂(N) = O(f(N) · g(N))
  (a proof sketch of the sum rule appears at the end of these notes)
• If T(N) is a polynomial of degree k, then T(N) = Θ(Nᵏ).
• logᵏ N = O(N), for any constant k. (Why?)

Techniques

• Algebra. Example: f(N) = N / log N, g(N) = log N. Comparing f and g is the same as asking which grows faster, N or log² N.
• Evaluate the limit of f(N)/g(N) as N → ∞:
  – limit is 0               → f(N) = o(g(N))
  – limit is c ≠ 0 (finite)  → f(N) = Θ(g(N))
  – limit is ∞               → g(N) = o(f(N))
  – no limit                 → ??? (this test gives no conclusion)
  (a small script that tabulates such ratios appears at the end of these notes)

Techniques, cont'd

• L'Hôpital's rule: if f(N) → ∞ and g(N) → ∞ as N → ∞, then
  lim (N→∞) f(N)/g(N) = lim (N→∞) f′(N)/g′(N).
• Example: f(N) = N, g(N) = log N. Use L'Hôpital's rule: f′(N) = 1 and g′(N) = 1/N (up to a constant factor when the log is base 2), so f(N)/g(N) has the same limit as 1 / (1/N) = N, which goes to ∞. Hence g(N) = o(f(N)).
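
Worked examples

The following check is not from the original slides; it applies the Big-Oh definition to the last item on the Examples slide. The constants chosen (c = 31, n₀ = 1) are just one valid choice, not the only one.

  Claim: 214n + 34 = O(2n² + 8n).
  Take c = 31 and n₀ = 1. For all n ≥ 1:
      214n + 34 ≤ 214n + 34n       (since 34 ≤ 34n when n ≥ 1)
                = 248n
                = 31 · 8n
                ≤ 31 · (2n² + 8n).
  So T(n) = 214n + 34 satisfies T(n) ≤ c · (2n² + 8n) for all n ≥ n₀, which is exactly the definition of Big-Oh.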
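
Next, a proof sketch of the sum rule from the Facts about Big-Oh slide. The argument is standard, but the constant names below (c₁, c₂, n₁, n₂) are introduced here, not taken from the slides.

  Suppose T₁(N) ≤ c₁·f(N) for all N ≥ n₁ and T₂(N) ≤ c₂·g(N) for all N ≥ n₂.
  Let c = max(c₁, c₂) and n₀ = max(n₁, n₂). Then for all N ≥ n₀:
      T₁(N) + T₂(N) ≤ c₁·f(N) + c₂·g(N) ≤ c · (f(N) + g(N)),
  so T₁(N) + T₂(N) = O(f(N) + g(N)).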
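
Finally, the limit test from the Techniques slide can also be explored numerically. The sketch below is not part of the slides; it is a minimal Python script (the function names f and g are mine) that tabulates f(N)/g(N) for the slide's example f(N) = N / log N and g(N) = log N. A growing ratio is evidence, not a proof, but it is consistent with the conclusion g(N) = o(f(N)).

  import math

  def f(n):
      # f(N) = N / log N, the first function from the Techniques slide (log base 2)
      return n / math.log2(n)

  def g(n):
      # g(N) = log N, the second function from the Techniques slide
      return math.log2(n)

  # Tabulate f(N)/g(N) for N = 2^10, 2^20, ..., 2^60.
  # If the ratio keeps growing, that is consistent with g(N) = o(f(N)).
  for k in range(10, 70, 10):
      n = 2 ** k
      print(f"N = 2^{k:2d}   f(N)/g(N) = {f(n) / g(n):.1f}")

Running the script shows the ratio N / log² N growing without bound as N increases, matching the "limit is ∞" row of the table on the Techniques slide.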