Algorithmic Time Complexity
Basics
Shantanu Dutt
ECE Dept.
UIC
Time Complexity
• An algorithm's time complexity is a function T(n) of problem size n that represents how much time the algorithm will take to complete its task.
• Note that there could be more than one problem-size parameter, in which case we can denote the time complexity function as T(S), where S is the set of size parameters. E.g., for the shortest-path problem on a graph G, we have 2 size parameters: n, the # of vertices, and e, the # of edges (thus T(S) = T(n,e)). The following will make sense for 465 students: for the covering part of Quine-McCluskey (QM), we also have 2 size parameters: m, the # of MTs (minterms), and p, the # of PIs (prime implicants), so T(S) = T(m,p). In turn, both m and p have upper bounds that are functions of n, the # of variables of the logic function whose cost is being minimized.
• In general, the runtime of an algorithm/program is determined not only by the algorithm, but also by the processor speed, memory size and speed, bus speed, etc. However, T(n) generally overlooks technology parameters and focuses on the intrinsic # of basic steps that an algorithm has to perform as a function of n.
• The main job of T(n) is to represent how the algorithm's runtime grows as a function of n, as opposed to giving us an absolute time that the algorithm will take to solve its problem.
• Thus if T(n) = n, the runtime doubles as n doubles; if T(n) = n^2, the runtime increases 4 times as n doubles ((2n)^2 = 4n^2); and if T(n) = 2^n, the runtime is squared as n doubles (2^(2n) = (2^n)^2).
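A small illustrative Python sketch (ours, not from the slides) of these growth rates: it prints the ratio T(2n)/T(n) for the three functions above, showing the doubling, quadrupling, and squaring behavior.

# How the step count changes when the problem size n doubles.
funcs = {"n": lambda n: n, "n^2": lambda n: n**2, "2^n": lambda n: 2**n}
for name, T in funcs.items():
    for n in (10, 20):
        # For T(n) = n the ratio is 2; for n^2 it is 4; for 2^n it is 2^n,
        # i.e., the runtime is squared when n doubles.
        print(f"T(n) = {name}: T({2*n})/T({n}) = {T(2*n) / T(n):g}")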
Time Complexity
• T(n) is determined by counting the number, as a function of n, of basic steps, i.e., steps whose time to complete does not depend upon the input size (constant-time operations), that the algorithm has to perform. Note that different basic steps (e.g., addition vs. multiplication of integers) may take different absolute times, but each counts as 1 operation, and these times are independent of n.
[Figure not reproduced: input of size n feeds algorithm A; counting the # of basic steps A performs that are independent of n (i.e., constant-time operations) yields T(n).]
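A minimal sketch (Python, illustrative; the summing example is our choice, not the slides') of this counting view: instrument a simple algorithm and report the # of constant-time basic steps it performs as T(n).

def array_sum_with_count(a):
    # Sum a list while counting the constant-time basic steps performed
    # (here, one integer addition per element).
    steps = 0
    total = 0
    for x in a:
        total += x   # one constant-time addition
        steps += 1
    return total, steps

for n in (4, 8, 16):
    _, steps = array_sum_with_count(list(range(n)))
    print(f"n = {n:2d}: basic steps = {steps}")   # steps == n, so T(n) = n here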
Time Complexity: Big O Notation
• The big-O notation for T(n) specifies the closest or smallest upper-bound function f(n) for T(n) (denoted by T(n) = O(f(n))), which essentially says that for a large enough n, T(n) <= c*f(n) for some constant c.
• Formally, we say that for two monotonic functions T(n) and f(n), T(n) = O(f(n)) if there exist a constant c and an n0, s.t.
T(n) <= c*f(n), for all n >= n0
• The following is a graphical representation of the T(n) = O(f(n)) relation:
[Figure not reproduced: curves of T(n) and c*f(n), with c*f(n) above T(n) for all n >= n0.]
Ack: Graph obtained from http://www.cs.cmu.edu/~adamchik/15-121/lectures/Algorithmic%20Complexity/complexity.html
Time Complexity: Big O Notation
• Example: T(n) = n^2 + 2n + 3; then f(n) = n^2, i.e., T(n) = O(n^2).
Proof: For n >= 2 (= n0), T(n) = n^2 + 2n + 3 < n^2 + n^2 + n^2 = 3n^2.
Thus for n0 = 2 and c = 3, T(n) <= c*n^2 for n >= n0
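A quick numeric spot-check (illustrative Python, not part of the original slides) of the constants found in the proof:

# With c = 3 and n0 = 2, T(n) = n^2 + 2n + 3 <= 3*n^2 for all n >= 2
# (verified here over a finite range).
def T(n):
    return n**2 + 2*n + 3

assert all(T(n) <= 3 * n**2 for n in range(2, 1000))
print("T(n) <= 3*n^2 holds for n = 2..999")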
• In general (but not always), to determine the big O complexity
term for T(n), determine its most dominant term (the term that
will be greater than all other terms for a large enough n), get
rid of all constants in the term, and if this simplified term is
f(n), then T(n) = O(f(n)). Note: If we can determine T(n) in this
manner, then actually T(n) is also Θ(f(n)).
• f(n) is also called the asymptotic complexity of T(n)
• Sometimes it is important to retain the multiplicative constants or exponentiation constants, or all constants (e.g., in other complexity measures such as hardware cost). Thus it may be important not to simplify 3n^2 to n^2. This depends on the level of detail to which the algorithm is being analyzed and on the complexity of competing algorithms.
Time Complexity: Big Ω, Θ Notations
• Definition of "big Omega"
• We need a notation for the lower bound. A capital omega (Ω) is used in this case. We say that T(n) = Ω(g(n)) when there exist constants c and n0 such that T(n) ≥ c*g(n) for all sufficiently large n (n >= n0).
Examples:
Examples:
– n = Ω(1)
– n^2 = Ω(n)
– n^2 = Ω(n log n)
– 2n + 1 = Ω(n)
• Definition of "big Theta"
• To measure the complexity of a particular algorithm exactly means to find matching upper and lower bounds. A third notation is used in this case. We say that T(n) = Θ(g(n)) if and only if T(n) = O(g(n)) and T(n) = Ω(g(n)). If a worst-case analysis is exact in terms of the dominating term of T(n), we can say T(n) = Θ(g(n)) rather than T(n) = O(g(n)).
Examples:
– 2n = Θ(n)
– n^2 + 2n + 1 = Θ(n^2)
Ack: Obtained from http://www.cs.cmu.edu/~adamchik/15-121/lectures/Algorithmic%20Complexity/complexity.html
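An illustrative Python sketch (ours, not from the source page) of the Θ idea for the last example: the ratio T(n)/g(n) stays bounded between two positive constants.

# T(n) = n^2 + 2n + 1: T(n)/n^2 approaches 1 from above and never exceeds
# 1.21 for n >= 10, so T(n) is sandwiched between 1*n^2 and 1.21*n^2.
def T(n):
    return n**2 + 2*n + 1

for n in (10, 100, 1000, 10000):
    print(f"n = {n:5d}: T(n)/n^2 = {T(n) / n**2:.4f}")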
Time Complexity: Big O, Θ Notations
• Graphs illustrating the Big O and Θ notations:
[Figure not reproduced: the asymptotic bounds O and Θ on T(n).]
Ack: Graph obtained from http://www.leda-tutorial.org/en/official/ch02s02s03.html
Time Complexity: Big O, Θ Notations—When we can use Θ and when we cannot
• Usage of the O and Θ notations: when our analysis of T(n) (or any counting function) is exact at least for the dominating term, we can and should use Θ. Otherwise, if we can only determine an upper-bound function, we use the O notation.
• The following examples will make sense for 465 students:
• Example 1:
– m(n) = worst-case # of MTs of a non-trivial n-variable function f( ) (i.e., f != 0 or 1). Certainly m(n) < 2^n. If we choose MTs that have an even # of 1's in their binary representation, we have 2^(n-1) = 0.5*2^n MTs, and the function is non-trivial (it is the even-parity function). Thus m(n) = O(2^n).
– The question is whether we can say that m(n), in the worst case, is at least of the order of 2^(n-1). Since our analysis above was exact, we can say yes. More formally, if we choose a constant c2 = 0.49, then the # of MTs in an even-parity function is > c2*2^n, and thus m(n) > c2*2^n. Hence m(n) = Ω(2^n). (This count is spot-checked in the sketch after Example 2.)
– From the above, m(n) = Θ(2^n).
• Example 2:
– p(n) = worst-case # of PIs of an n-variable function f( ). We know that across all n-variable functions the # of PIs is at most the # of ternary notations w/ symbols 0, 1, X, which is 3^n. So we can say p(n) = O(3^n).
– However, the above analysis may not be exact, in the sense that, from this analysis at least, we do not know for certain if there is any function that has p(n) of the order of 3^n. Thus we cannot say that p(n) = Ω(3^n), and hence we cannot say that p(n) = Θ(3^n).
– Note that, as has been shown earlier, the max # of PIs covering a MT can be of the order of C(n, n/2) ~ 2^(n-1) (choose n/2 positions in the MT's binary repr. and make them X's to get an implicant that covers the MT; none of the implicants obtained this way cover each other, so if these are the only uncovered implicants of a function, they are all PIs of that function). Thus we can say that in the worst case p(n) = Ω(2^n). But since 3^n is not of the same order as 2^n (3^n = 2^(n*log2(3)), i.e., 3^n != Θ(2^n): 3^n = Ω(2^n) but 3^n != O(2^n)), we cannot say that p(n) = Ω(3^n).
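A small Python spot-check (ours, not the slides') of the even-parity count used in Example 1: the # of n-bit MTs with an even # of 1's is exactly 2^(n-1).

from itertools import product

for n in range(1, 8):
    # Enumerate all 2^n minterms and keep those with an even # of 1's.
    even = sum(1 for bits in product((0, 1), repeat=n) if sum(bits) % 2 == 0)
    assert even == 2 ** (n - 1)
    print(f"n = {n}: even-parity minterms = {even} = 2^(n-1)")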
Different Types of Time Complexity Analysis
• The term analysis of algorithms describes approaches to the study of the performance of algorithms. The most prevalent type is the worst-case runtime complexity of the algorithm: the function defined by the maximum number of steps taken on any instance of size n.
• The best-case runtime complexity of the algorithm is the function defined by the minimum number of steps taken on any instance of size n.
• The average-case runtime complexity of the algorithm is the function defined by the average number of steps taken over all instances of size n.
• The amortized runtime complexity of the algorithm is the function defined by a sequence of operations applied to an input of size n and averaged over the sequence.
• Example: let us consider sequential search for an element in a sorted array of size n (see the sketch below).
– Its worst-case runtime complexity is O(n) = Θ(n)
– Its best-case runtime complexity is O(1)
– Its average-case runtime complexity is O(n/2) = O(n) = Θ(n). Why?
Ack: Obtained from http://www.cs.cmu.edu/~adamchik/15-121/lectures/Algorithmic%20Complexity/complexity.html
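A sketch (Python, illustrative) of the sequential-search example, counting element comparisons; the comment at the end suggests the answer to the "Why?" above.

def seq_search(a, key):
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1           # one element comparison per step
        if x == key:
            return i, comparisons
    return -1, comparisons

a = list(range(100))               # sorted array, n = 100
print(seq_search(a, 0))            # best case: 1 comparison   -> O(1)
print(seq_search(a, 99))           # worst case: n comparisons -> Θ(n)
# Averaging over all n positions of the key gives (n+1)/2 ~ n/2 comparisons;
# the constant 1/2 is dropped in asymptotic notation, so this is still Θ(n).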
Examples of Time Complexity Analysis
• [The slide's worked example is not reproduced in this text version.] Its total step count is Θ(n^3).
• Generally, it is important to count only the "main" operations, here only additions and multiplications: n^3 additions and n^3 multiplications, and we arrive at the same O and Θ notation complexities of n^3.
Ack: Adapted from http://www.cs.cornell.edu/courses/cs211/2005sp/Lectures/L14-BigO/L14-15-Complexity.4up.pdf
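The worked example itself did not survive extraction; as a stand-in with the same operation counts (an assumption on our part, not necessarily the slide's actual example), here is naive n x n matrix multiplication, which performs n^3 "main" additions and n^3 multiplications:

def matmul_counted(A, B):
    # Naive n x n matrix multiplication with operation counters.
    n = len(A)
    C = [[0] * n for _ in range(n)]
    adds = mults = 0
    for i in range(n):
        for j in range(n):
            s = 0
            for k in range(n):
                s += A[i][k] * B[k][j]   # 1 multiplication + 1 addition
                mults += 1
                adds += 1
            C[i][j] = s
    return C, adds, mults

A = [[1, 2], [3, 4]]
_, adds, mults = matmul_counted(A, A)
print(adds, mults)   # 8 8 for n = 2, i.e., n^3 of each "main" operation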
Examples of Time Complexity Analysis
Ack: Obtained from http://www.cs.cornell.edu/courses/cs211/2005sp/Lectures/L14-BigO/L14-15-Complexity.4up.pdf
Merge Sort (MS) Analysis
• One way to solve a recurrence is to open it up and obtain a series of terms to be summed.
• Below, we obtain the complexity of merge sort directly (ignoring the recurrence). The main basic operation is a comparison of 2 elements/numbers. (A counting sketch follows the diagram.)
Legend (arrow glyphs not reproduced): one arrow type marks computation break-up; the other marks data transfer/dependency.
[Recursion-tree figure not reproduced: MS(n) breaks up into two MS(n/2) calls whose sorted outputs feed Merge(n); each MS(n/2) breaks up into two MS(n/4) calls feeding Merge(n/2); and so on down to size-1 subproblems.]
– Merge(n): worst-case comparisons = ? best-case comparisons = ?
– Total comparisons @ level 1: worst-case: ? best-case: ?
– Total comparisons @ level 2: worst-case: ? best-case: ?
– Total comparisons @ last merge level (?): worst-case: ? best-case: ?
– Total worst-case complexity: ? Total best-case complexity: ?
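A sketch (Python, illustrative) of merge sort instrumented to count comparisons, the basic operation named above; the printed counts grow like n*log2(n).

import math, random

def merge(left, right):
    # Merge two sorted lists, counting element comparisons.
    out, i, j, comps = [], 0, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out, comps

def merge_sort(a):
    # Returns (sorted list, total # of comparisons performed).
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, cm = merge(left, right)
    return merged, cl + cr + cm

for n in (16, 64, 256):
    _, c = merge_sort(random.sample(range(10**6), n))
    print(f"n = {n:3d}: comparisons = {c:5d}  (n*log2(n) = {int(n * math.log2(n))})")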
Importance of Asymptotic Analysis—Worst- & Average-Case
[Table not reproduced: runtimes for various T(n) and problem sizes n; the largest entries run into years.]
• Asymptotic analysis tells us whether a technique/algorithm will be practical in all cases (worst-case analysis) or on average (average-case analysis) for problem sizes of interest.
Ack: Table obtained from http://www.csd.uwo.ca/courses/CS1037a/notes/topic13_AnalysisOfAlgs.pdf
Importance of Asymptotic Analysis—Worst- & Average-Case (contd.)
• Assume each basic operation takes 1 ms.
[Table not reproduced: T(n) values in ms for various functions T(n) and problem sizes n.]
Ack: Obtained from http://www.cs.cornell.edu/courses/cs211/2005sp/Lectures/L14-BigO/L14-15-Complexity.4up.pdf
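The table itself did not survive extraction; this illustrative Python sketch rebuilds one like it under the stated 1-ms-per-operation assumption, with a typical set of growth functions and sizes (our choice, not necessarily the slide's rows and columns):

import math

# Entries are T(n) in ms, assuming each basic operation takes 1 ms.
funcs = {"n": lambda n: n, "n log n": lambda n: n * math.log2(n),
         "n^2": lambda n: n ** 2, "2^n": lambda n: 2.0 ** n}
sizes = (10, 20, 50)
print(f"{'T(n)':>8}" + "".join(f"{n:>22}" for n in sizes))
for name, T in funcs.items():
    print(f"{name:>8}" + "".join(f"{T(n):22,.0f}" for n in sizes))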
Importance of Asymptotic Analysis—Worst- & Average-Case (contd.)
[Table not reproduced: further T(n) values for various functions T(n) and sizes n.]
Ack: Obtained from http://www.cs.cornell.edu/courses/cs211/2005sp/Lectures/L14-BigO/L14-15-Complexity.4up.pdf
Asymptotic Complexity and Efficient Algorithms
Ack: Obtained from http://www.cs.cornell.edu/courses/cs211/2005sp/Lectures/L14-BigO/L14-15-Complexity.4up.pdf
Concluding Remarks
Ack: Obtained from http://www.cs.cornell.edu/courses/cs211/2005sp/Lectures/L14-BigO/L14-15-Complexity.4up.pdf