CENG 218 Design and Analysis of Algorithms
Izmir Institute of Technology
Lecture 2: Asymptotic Notation

Growth of functions
• Previously, we saw how to compute the running time of an algorithm in terms of the input size n.
• The order of growth of that running time, T(n), gives an idea of the algorithm's efficiency.
• Thus, we can compare the relative performance of alternative algorithms.
• This is useful in engineering for showing that one design scales better or worse than another.

Growth of functions
• When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³) algorithm.
[Figure: T(n) versus n, showing the crossover point n₀ beyond which the Θ(n²) curve stays below the Θ(n³) curve.]

Asymptotic performance
• To make a comparison, we look at input sizes that are large enough.
• That is, we consider the size of the input in the limit, i.e., the growth of T(n) as n → ∞.
• This is why we call it 'asymptotic' performance.

Asymptotic performance: Example
• Suppose you are designing a web site to process database records of n users.
• Program A takes T_A(n) = 30n + 8 microseconds, while program B takes T_B(n) = n² + 1.
• Which program do you choose?
[Figure: plot of T_A(n) = 30n + 8 and T_B(n) = n² + 1 against n; T_A grows more slowly for large n.]

Asymptotic analysis
• Asymptotic analysis is a useful tool to structure our thinking toward better algorithms.
• It is independent of the computer/platform.
• We shouldn't ignore asymptotically slower algorithms, however. They may be faster for reasonable input sizes (when n₀ is very large).
• Real-world design situations often call for a careful balancing.

Θ-notation
• Mathematical definition: f(n) = Θ(g(n)) (read = as 'is') means there exist positive constants c₁, c₂, and n₀ such that
  c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀
• f(n) is 'sandwiched' between c₁·g(n) and c₂·g(n) for big enough n.
[Figure: f(n) lying between c₁·g(n) and c₂·g(n) for all n ≥ n₀.]

Θ-notation
• Example: f(n) = 8n³ + 5n² + 7, g(n) = n³. What may c₁, c₂, n₀ be?
• c₁ can be 1 as long as n₀ is positive.
• c₂ = 10 and n₀ = 3 is a proper pair.
• Therefore, using (c₁, c₂, n₀) = (1, 10, 3) we can say that 8n³ + 5n² + 7 = Θ(n³).

Θ-notation
• If the definition is satisfied, we say that g(n) is an asymptotically tight bound for f(n).
• Important: the c₁, c₂, n₀ values that make the statement true are not unique. Any triple that satisfies the inequalities also works.

Facts about Θ-notation
• For a polynomial, the leading (nth-degree) term dominates its growth: if f(x) = aₙxⁿ + aₙ₋₁xⁿ⁻¹ + ... + a₁x + a₀ with aₙ > 0, then f(x) = Θ(xⁿ).
• Transitivity: f = Θ(g) and g = Θ(h) → f = Θ(h)
• Reflexivity: f = Θ(f)
• Symmetry: f = Θ(g) if and only if g = Θ(f)

More Θ-notation facts
• Sums of functions: if g = Θ(f) and h = Θ(f), then g + h = Θ(f).
• Operations with a constant a > 0: Θ(a·f) = Θ(f + a) = Θ(f − a) = Θ(f)
• Combination of functions: if f₁ is Θ(g₁) and f₂ is Θ(g₂), then
  1) f₁·f₂ is Θ(g₁·g₂)
  2) f₁ + f₂ is Θ(g₁ + g₂) = Θ(max(g₁, g₂))

Θ-notation exercises
1) Give the Θ estimate for (n log n + n²)(n³ + 2).
Solution: (n log n + n²)(n³ + 2) = (Θ(n)·Θ(log n) + Θ(n²))·Θ(n³) = (Θ(n log n) + Θ(n²))·Θ(n³) = Θ(n²)·Θ(n³) = Θ(n⁵)
2) Give the Θ estimate for 3n⁴ + log₂ n⁸.
Solution: 3n⁴ + log₂ n⁸ = 3n⁴ + 8 log₂ n = Θ(n⁴)

O-notation (Big-O)
• We say f(n) = O(g(n)) if there exist positive constants c and n₀ such that
  0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀
  – When n is greater than n₀, function f is at most a constant c times function g.
• We say that g(n) is an asymptotic upper bound for f(n).
• The following descriptions are also used: "f is at most order g", or "f is big-O of g".
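As a quick numeric sanity check of the big-O definition, here is a minimal Python sketch; the helper name holds_big_o and the sampling bound n_max are illustrative assumptions, not part of the lecture. It tests a claimed (c, n₀) pair for the f(n) and g(n) used in the exercise on the next slide; since it samples only finitely many n, it can refute a bad pair but cannot prove the bound for all n (the algebra on the next slide does that).

def holds_big_o(f, g, c, n0, n_max=10_000):
    # Check 0 <= f(n) <= c*g(n) for every sampled n in [n0, n_max].
    # Sampling finitely many n can only refute a claimed (c, n0) pair,
    # never prove the bound for all n >= n0.
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 2 * n * n + 5 * n + 9   # f(n) from the exercise on the next slide
g = lambda n: n * n                   # g(n) = n^2

print(holds_big_o(f, g, c=2, n0=1))   # False: c = 2 is never enough
print(holds_big_o(f, g, c=3, n0=7))   # True on the sampled range, matching (c, n0) = (3, 7)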
Big-O definition exercise
• Show that f(n) = 2n² + 5n + 9 is O(n²) by finding a pair (c, n₀) for the definition of big-O.
• Take g(n) = n² and f(n) = 2n² + 5n + 9.
• For c = 2, 2n² + 5n + 9 ≤ 2n² is never true.
• Let c = 3; then 2n² + 5n + 9 ≤ 3n² holds when 5n + 9 ≤ n², which corresponds to n ≥ 7.
• Therefore, (c, n₀) = (3, 7) is a proper pair (not the only one) to show that f(n) is O(n²).

Some facts about big-O
• Unlike Θ, g(n) can be of higher order than f(n).
• Because big-O denotes an upper bound, we can write 20n² + 4 = O(n⁵).
• It has transitivity: f = O(g) and g = O(h) → f = O(h)
• It has reflexivity: f = O(f)
• It does NOT have symmetry: f = O(g) does not imply g = O(f)

Ω-notation (big omega)
• We say f(n) = Ω(g(n)) if there exist positive constants c and n₀ such that
  0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀
  – When n is greater than n₀, function f is at least a constant c times function g.
• We say that g(n) is an asymptotic lower bound for f(n).
• The following descriptions are also used: "f is at least order g", or "f is big-Ω of g".

Some facts about Ω
• Since Ω denotes a lower bound, g(n) can be of lower order than f(n).
  E.g.: 20n² + 4 = Ω(n)
  E.g.: √n = Ω(log₂ n). Is it?
• Like big-O, Ω has transitivity and reflexivity.
• Like big-O, Ω does NOT have symmetry.

Difference between O and Θ estimates
Question: Give the order of growth for the sum of the first n positive integers that are divisible by 3.
Solution 1:
  f(n) = 3 + 6 + 9 + ... + 3n < 3n + 3n + 3n + ... + 3n
  3 + 6 + 9 + ... + 3n < n·(3n)
  3 + 6 + 9 + ... + 3n < 3n²
  f(n) = O(n²)
With this solution you cannot say f(n) = Θ(n²). To find a Θ estimate, we need a closed-form formula.
Solution 2:
  f(n) = 3·∑_{k=1}^{n} k = 3·n(n+1)/2 = (3/2)(n² + n) = Θ(n²)

Θ(g), O(g), and Ω(g)
• f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Macro convention
• "O(f)", when used as a term in an arithmetic expression, means "some function h such that h = O(f)".
• E.g.: "x² + O(x)" means "x² plus some function that is O(x)".

o-notation (little-o)
• We use o-notation to denote that the asymptotic upper bound provided by big-O is NOT asymptotically tight.
• E.g.: 2n = o(n²), but 2n² ≠ o(n²)
• f(n) = o(g(n)) if for any constant c > 0 there exists n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀.
• Another view: f(n) = o(g(n)) means lim_{n→∞} f(n)/g(n) = 0.

ω-notation (little omega)
• We use ω-notation to denote that the asymptotic lower bound provided by Ω is NOT asymptotically tight.
• E.g.: n²/2 = ω(n), but n²/2 ≠ ω(n²)
• f(n) = ω(g(n)) if for any constant c > 0 there exists n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀.
• Another view: f(n) = ω(g(n)) means lim_{n→∞} f(n)/g(n) = ∞.

More facts
• Analogy with the comparison of two numbers:
  f(n) = O(g(n)) is like a ≤ b
  f(n) = Ω(g(n)) is like a ≥ b
  f(n) = Θ(g(n)) is like a = b
  f(n) = o(g(n)) is like a < b
  f(n) = ω(g(n)) is like a > b
• Transpose symmetry:
  f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
  f(n) = o(g(n)) if and only if g(n) = ω(f(n))

Relations between Θ(g), O(g), Ω(g)
• Subset relations between the order-of-growth sets of functions R → R: o(g) ⊂ O(g), ω(g) ⊂ Ω(g), and Θ(g) = O(g) ∩ Ω(g).
[Figure: Venn diagram of these sets, with a function f placed in the region matching its relation to g. (c) 2001-2003, Michael P. Frank]
• The big-omega and big-theta notations were introduced by Donald Knuth (1938-...).

Exercises
1) 2ⁿ⁺¹ = O(2ⁿ). T/F? True (2ⁿ⁺¹ = 2·2ⁿ).
2) f(n) + O(f(n)) = Θ(f(n)). T/F? True (the added function contributes at most a constant multiple of f(n)).
3) 2ⁿ = ω(2ⁿ⁻²). T/F? False (2ⁿ = 4·2ⁿ⁻², only a constant factor apart).

Terminology for the Growth of Functions
• Θ(1): constant
• Θ(log_c n): logarithmic (same order for any base c)
• Θ(logᶜ n): polylogarithmic
• Θ(n): linear
• Θ(n log n): n log n
• Θ(nᶜ): polynomial
• Θ(cⁿ), c > 1: exponential
• Θ(n!): factorial

Ordering of Functions
• Let's write f ≺ g when g is of higher order than f, and let k > 1. Then:
  1 ≺ log n ≺ n ≺ n log n ≺ nᵏ ≺ kⁿ ≺ n! ≺ nⁿ
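This ordering can also be seen numerically. The following minimal Python sketch (the helper name log10_ops is an illustrative assumption; the 10⁻⁹ second per operation figure mirrors the table on the next slide) prints the order of magnitude of the running time for several growth rates, working in log space so that huge counts such as 2ⁿ and n! do not overflow.

import math

def log10_ops(name, n):
    # log10 of the operation count for a few of the growth rates above.
    if name == "log n":
        return math.log10(math.log2(n))
    if name == "n":
        return math.log10(n)
    if name == "n log n":
        return math.log10(n) + math.log10(math.log2(n))
    if name == "n^2":
        return 2 * math.log10(n)
    if name == "2^n":
        return n * math.log10(2)
    if name == "n!":
        return math.lgamma(n + 1) / math.log(10)   # log10(n!) via the log-gamma function
    raise ValueError(name)

LOG10_SECONDS_PER_OP = -9   # assumed machine speed: one operation per nanosecond

for n in (10, 10**6):
    print("n =", n)
    for name in ("log n", "n", "n log n", "n^2", "2^n", "n!"):
        log_sec = log10_ops(name, n) + LOG10_SECONDS_PER_OP
        print("  %-8s ~ 10^%.1f seconds" % (name, log_sec))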
Computer time examples
Assume 10⁻⁹ second per operation.

  #opers     n = 10       n = 10⁶
  log n      3·10⁻⁹ s     2·10⁻⁸ s
  n          10⁻⁸ s       10⁻³ s
  n log n    3·10⁻⁸ s     2·10⁻² s
  n²         10⁻⁷ s       17 min
  2ⁿ         10⁻⁶ s       >10^300,000 years
  n!         3·10⁻³ s     ****

Quiz
• Which ones are more appealing: n¹⁰⁰⁰, n², or 2ⁿ? The polynomials.
• Which ones are more appealing: 1000n², 3n², or 2n³? The quadratic ones; constant factors do not change the order of growth (see the numeric check after the last slide).

The End
• Next, we will continue with the Divide-and-Conquer approach to analyze recursive algorithms.
• Please solve the exercises of Section 3.1 and the problems of Chapter 3.
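For the second quiz question, here is a minimal Python sketch (the function names t_quad_big, t_quad_small, and t_cubic are illustrative, not from the slides) showing that 2n³ eventually overtakes 1000n² despite the large constant factor, which is why the quadratic costs are the more appealing ones.

def t_quad_big(n):    # hypothetical cost 1000n^2 from the quiz
    return 1000 * n * n

def t_quad_small(n):  # hypothetical cost 3n^2
    return 3 * n * n

def t_cubic(n):       # hypothetical cost 2n^3
    return 2 * n ** 3

# Find the first n where the cubic overtakes the 'expensive' quadratic.
n = 1
while t_cubic(n) <= t_quad_big(n):
    n += 1
print("2n^3 exceeds 1000n^2 from n =", n)   # prints 501

# Beyond the crossover, the cubic pulls away from both quadratics.
for n in (10, 500, 1000, 10_000):
    print(n, t_quad_big(n), t_quad_small(n), t_cubic(n))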